Ladna

Linux binary compatibility (basically the same as the last thread on this subject)


Yeah I agree, but we're exploring ways to verify that a user is running an "unmodified" binary - really only for competitive settings like a tournament or league - and that almost certainly rules out self-compiled binaries. You're right though, most Linux users will undoubtedly build from source, and for non-competitive purposes there should be no problem with that.


Just out of curiosity, have you performed any tests so far? Were there any particular distros that didn't work with a particular "universal" binary? I mean, the very existence of this poll means that it's not a given, despite what some Linux zealots would say.

Edit: a quick Google search on the topic brings up the usual stereotyped flame-fueling answers:

  1. The One True Way is to compile from source.
  2. There's no such thing as a "universal" Linux Binary. The One True Way is to compile from source.
  3. Use the LSB. But The One True Way is to compile from source.
  4. It can be done, but Stallman doesn't like it. The One True Way is to compile from source.
  5. Static linking is evil. The One True Way is to compile from source.
  6. Use a VM/interpreted language. But The One True Way is to compile from source.
  7. Did we mention that The One True Way is to compile from source?
On a more serious note, these answers seem spot-on, but they are from 2009, so I don't know if things have improved dramatically in the meantime:

http://stackoverflow.com/questions/1522990/whats-the-best-way-to-distribute-a-binary-application-for-linux

http://stackoverflow.com/questions/1209674/shipping-closed-source-application-for-linux

Ladna said:

Yeah I agree, but we're exploring ways to verify that a user is running an "unmodified" binary - really only for competitive settings like a tournament or league - and that almost certainly rules out self-compiled binaries. You're right though, most Linux users will undoubtedly build from source, and for non-competitive purposes there should be no problem with that.


ezQuake already tried client-side verification with a security module and it just resulted in drama.

The developers ended up discontinuing the security module for ezQuake 1.9 and now just rely on the client to tell the truth via the f_* security checks and ruleset settings. I think those are about as far as you should bother going. If you do decide to go with a closed-source security module for whatever reason, know that you can't distribute it with the port itself without violating the GPL.
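If all a league wants is a coarse "are you on the official build?" check, the usual sketch is a content hash of the binary compared against a published value. Everything here (file names, the hash source) is hypothetical:

```shell
#!/bin/sh
# Hypothetical sketch: a league publishes the SHA-256 of the official
# client build, and players (or an admin tool) compare their binary
# against it. The stub file below stands in for the real binary.
BINARY=./official-client
printf 'pretend this is the official binary\n' > "$BINARY"

# In reality this hash would come from the tournament's announcement.
OFFICIAL_SHA256=$(sha256sum "$BINARY" | awk '{print $1}')

ACTUAL=$(sha256sum "$BINARY" | awk '{print $1}')
if [ "$ACTUAL" = "$OFFICIAL_SHA256" ]; then
    echo "binary matches official build"
else
    echo "binary does NOT match official build" >&2
fi
```

Note that a client-side check like this is trivially spoofable, which is essentially the ezQuake lesson above: any meaningful enforcement has to rest on what the server itself can observe.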


Whooooooooooooa thread derailment. I made a thread about binary verification here.

Re: LSB stuff

I'm not that interested in making DEB/RPM packages, LSB-compliant or not. To a certain degree I'm interested in making it easy for packagers to package EE, but that's far from a major concern of mine. EE requires the EXE, a base folder with EDFs & configuration files, and an IWAD. I can bundle the EXE, the EDFs, and Freedoom, and a small script that attempts to associate the "eternity:" protocol with that EXE. Where the user/packager ultimately chooses to put stuff is up to them.
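For the "eternity:" protocol association, the freedesktop.org scheme-handler mechanism is one plausible way such a script could work. This is a hedged sketch, not EE's actual script: the install path `/opt/eternity/eternity` is an assumption, and it needs xdg-utils on the user's system:

```shell
#!/bin/sh
# Sketch of registering an "eternity:" URL handler the freedesktop way.
# /opt/eternity/eternity is a hypothetical install path, not EE's
# actual layout.
APPDIR="$HOME/.local/share/applications"
mkdir -p "$APPDIR"

cat > "$APPDIR/eternity.desktop" <<'EOF'
[Desktop Entry]
Type=Application
Name=Eternity Engine
Exec=/opt/eternity/eternity %u
MimeType=x-scheme-handler/eternity;
NoDisplay=true
EOF

# Route eternity: URLs to that entry (xdg-utils ships with most
# desktop distros; skip quietly if it isn't there).
if command -v xdg-mime >/dev/null 2>&1; then
    xdg-mime default eternity.desktop x-scheme-handler/eternity
fi
```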

And yeah I'm dogfooding this. I compile the server binaries used for testing on 32-bit CentOS 5.6, and I run them on 64-bit Debian 6.0.

Finally, this is solely for people who feel uncomfortable compiling from source (a good number of Linux users), people who are simply unable to for various reasons, and competitive scenes that have a strong interest in preventing players from playing with unofficial binaries.

Maes said:

  • There's no such thing as a "universal" Linux Binary. The One True Way is to compile from source.


    Kinda hard to make a binary that works across all different CPU architectures. Like, will this run on my ARM-9 handheld? ;)

    I guess he wants to help newbies with Wintel boxes that have an Ubuntu dual-boot setup or something similar, but the distros can do a much better job of making sure it's built right for their environment (and test it along with all their other packages).

    I think the random binary from the Internet is just a remnant of the Win32 world.


    Yeah of course different CPU architectures will need separate binaries. I don't have a way to build SPARC or Itanium binaries, but I don't think anyone would try to play competitively on one of those machines anyway. ARM I can easily do (hey Ladna... is that a sexy Nokia N900 you have there....?1?!).

    Graf Zahl said:

    In my opinion any OS that can't run a 'random binary' is broken by default.


    Linux can run 'random binaries' as long as the libraries line up.


    Just as well as Windows, for that matter. The whole issue in the thread was basically to figure out what a reasonable "lowest common denominator" would be, and he's settled on CentOS 5.

    hex11 said:

    Kinda hard to make a binary that works across all different CPU architectures. Like, will this run on my ARM-9 handheld? ;)


    Lol, bad choice of terms on my part :-p

    Yeah, of course I didn't mean an opcode-level universal or "fat" binary that will run on anything from a Z80 to a PowerPC (though there ARE projects trying to do just that), but merely the Linux equivalent of the good old "compile once, run anywhere*" DOS/Win32 binary.

    * "Anywhere" means that a Win32 binary should run directly on anything from Win95 to Windows 7 "as is". To a lesser extent, you can run Win16 binaries in that same spectrum of M$ OSes. My understanding is that a true 1:1 equivalent to such a beast does not exist on the *nix world in general, let alone on Linux, even on the same CPU architecture.

    Maes said:

    * "Anywhere" means that a Win32 binary should run directly on anything from Win95 to Windows 7 "as is". To a lesser extent, you can run Win16 binaries in that same spectrum of M$ OSes. My understanding is that a true 1:1 equivalent to such a beast does not exist on the *nix world in general, let alone on Linux, even on the same CPU architecture.


    It's actually very possible. You can still run old binaries like Quake 3 Arena on modern Linux... granted, they'll have compatibility issues in some aspects and most people recommend using ioquake3, but it's still possible.

    Library issues are library issues no matter what platform you're on (see the Visual C++ redistributables mess on Windows). The bigger issue on Linux is figuring out where files go on a particular distro and how to put things into the program menu automatically, giving people the same effortless one-click install that a user might get installing from their native distro repository. However, you can simply give them a tar.bz2 that you tell them to install into /opt or $HOME/some-directory and trust that they know how to create a shortcut on their desktop.
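Building such a relocatable tar.bz2 is a one-liner once the tree is laid out. A hedged sketch with placeholder names ("mygame" and its contents are stand-ins; a real build would copy the compiled binary and data here instead):

```shell
#!/bin/sh
# Build a relocatable tar.bz2 that the user can unpack into /opt or
# $HOME. "mygame" and its contents are placeholders for the real
# binary and data directory.
set -e
rm -rf build
mkdir -p build/mygame/base
printf '#!/bin/sh\necho "mygame running"\n' > build/mygame/mygame
chmod +x build/mygame/mygame
tar -cjf mygame-1.0-linux-x86.tar.bz2 -C build mygame

# The user then installs it with a single command, e.g.:
#   tar -xjf mygame-1.0-linux-x86.tar.bz2 -C "$HOME"
```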

    AlexMax said:

    It's actually very possible.


    ...but avoided like the plague, or restricted to a relatively small subset of programs, for some reason. Usually those following the LSB closely or trivial enough not to spark library issues.

    AlexMax said:

    Library issues are library issues no matter what platform you're on (see the Visual C++ redistributables mess on Windows).


    Yeah, those are an undeniable part of computer lore ;-)
    Still, on Windows the most common practice is to do exactly what's so reviled on Linux: static linking and/or distributing the required libraries with each application (not distributing a needed DLL is just shitty practice, even on Windows. Stuff that requires Java or .NET is another matter). It may not be a Good Thing, but it "just works" (most of the time).

    AlexMax said:

    However, you can simply give them a tar.bz2 that you tell them to install into /opt or $HOME/some-directory and trust that they know how to create a shortcut on their desktop.


    Until it becomes common practice for the Linux community to distribute shit in a "click once, install automatically, create shortcuts" fashion just like installer .exes or .msi on Windows (perhaps with a reasonably "universal" binary as a bootstrapper for the process), it will never become a mainstream desktop OS. Hell, I'd be happy if at least the Mac OS X installation model were adopted (dragging a huge-ass icon into a folder, with the installer transparently doing the rest).

    Maes said:

    Until it becomes common practice for the Linux community to distribute shit in a "click once, install automatically, create shortcuts" fashion just like installer .exes or .msi on Windows (perhaps with a reasonably "universal" binary as a bootstrapper for the process), it will never become a mainstream desktop OS.


    What exactly do you think a .deb or an .rpm is? A package _is_ an installer, and it's often easier to install than a Windows program, since there are no extra steps like picking an installation directory. It's not common practice because Linux users are assumed to know what they're doing, there are fewer of them, and developers are lazy. Installing software on Linux is a solved problem.

    To swerve a bit offtopic, not being a mainstream Desktop OS is more the fault of many popular pieces of software not being available for it, such as Word and Photoshop. This isn't something that is likely to change, but in the face of smartphones and tablets (two markets where Linux does have a strong foothold, in the form of Android), I don't think it really matters in the long run. People always poked fun at those who predicted year X would be the year of Linux on the desktop, but it's been the year of Linux on everything except the desktop for a few years now, and the desktop is rapidly shrinking in relevance.

    Maes said:

    Hell, I'd be happy if at least the Mac OS X installation model were adopted (dragging a huge-ass icon into a folder, with the installer transparently doing the rest).


    That's actually a really bad example. A .dmg is just a glorified zip file, and the .app bundle is just a glorified program folder. Most big pieces of software like Photoshop have traditional installers, and Apple is pushing the Mac App Store because they know the 'app drag' is easy for novices to screw up.

    AlexMax said:

    What exactly do you think a .deb or an .rpm is?


    Good. Some distros down (the ones that use them), several more to go. See the problem?

    AlexMax said:

    It's not common practice because Linux users are assumed to know what they're doing


    OK, that's both good and bad. Back in the DOS days, users were, to a large degree, supposed to know what they were doing (been there, done that), so I can understand this point of view. On the other hand, it voids all those "Linux is great for grandma PC" arguments.

    AlexMax said:

    it's been the year of Linux on everything except the desktop for a few years now, and the desktop is rapidly shrinking in relevance.


    Doesn't the fact that Linux is successful in the mainstream only where it is not readily visible ring a bell?

    It just reinforces the notion that it's something that's better kept in the IT basement or the "engine room", outside the reach of "unauthorized personnel", just as was always the case with traditional UNIX.

    While I sympathize with that view of things (in fact, I believe that computers should be off-limits to anyone but white-coat lab scientists and chartered engineers or at least licensed and trained people, while "the rest of us" shouldn't interact with anything more complex than an arcade game), it doesn't change the fact that Linux is merely a bare "frame" so to speak, upon which some enthusiasts (also) built something that looks like a user-friendly desktop OS, without this really being its true nature or primary purpose.

    Maes said:

    Good. Some distros down (the ones that use them), several more to go. See the problem?


    The vast majority of distributions use one of those two package formats, and all of the ones targeted at Joe Blow do. Quite simply, if your distro doesn't use dpkg or rpm, you're already an advanced user who knows what to do without them.

    Maes said:

    Doesn't the fact that Linux is successful in the mainstream only where it is not readily visible ring a bell?


    Hey, you have a valid point here. Most users care only about their shit just working, not about what powers it underneath; see, for example, the fact that Android devices are never advertised with "Runs Linux!". Linux was used because it is the best kernel to build upon, but users don't care. Hell, even most developers don't care, since Dalvik hides all of the Linuxness from you anyway; the kernel could be swapped (and has been, as is the case with RIM's QNX-based BlackBerry PlayBook and its Android compatibility), and as long as Dalvik runs, apps and developers don't care.

    Maes said:

    it doesn't change the fact that Linux is merely a bare "frame" so to speak, upon which some enthusiasts (also) built something that looks like a user-friendly desktop OS, without this really being its true nature or primary purpose.


    Not entirely sure if you can discount corporations like Red Hat, Novell, and IBM hiring and paying developers full-time to provide the majority of desktop environment development and usability testing as mere "enthusiasts". If you can, then I'll just say Windows is built by enthusiasts too (I think Windows 7 lacks polish but hey!).

    Maes said:

    Good. Some distros down (the ones that use them), several more to go. See the problem?

    No? Just those two cover Debian/Ubuntu/Mint and Fedora/RHEL/CentOS/Scientific Linux/openSUSE/Mandriva. The only three popular exceptions I can think of that don't use either are Slackware, Gentoo and Arch, and those are for advanced users. Besides, even if you do decide to support more than just .rpm and .deb, why do you as the software author care? You have an automated way of building packages just like you do your Windows installers, right? Right?

    On the other hand, it voids all those "Linux is great for grandma PC" arguments.

    Grandma doesn't need to install anything, ever, as long as her system updates itself.

    It doesn't change the fact that Linux is merely a bare "frame" so to speak, upon which some enthusiasts (also) built something that looks like a user-friendly desktop OS, without this really being its true nature or primary purpose.

    "True nature?" "Primary purpose?" It's a kernel! Of course it doesn't care about an all-in-one installer that somehow magically works in all userlands. That's not what _any_ kernel is designed to do.

    *sigh*

    The reason Linux hasn't caught on has nothing to do with trivial things like userland differences between the distros preventing everyone from agreeing on one installer, and everything to do with the fact that Windows has tons of inertia, is "good enough" for most software, and most end-users don't pay for Windows directly, so they don't mind how much it costs.

    In the meantime, it is very possible for someone to create a Linux binary that works out of the box on 99% of x86 GNU/Linux systems in use, and very possible to create two packages (four if you want to distinguish between x86 and x86_64) that someone can double-click on and that, between them, just work on 90% of all GNU/Linux systems.

    AlexMax said:

    To swerve a bit offtopic, not being a mainstream Desktop OS is more the fault of many popular pieces of software not being available for it, such as Word and Photoshop. This isn't something that is likely to change



    Correct diagnosis, but you still draw the wrong conclusion.

    Why do you think so little commercial software is developed for Linux? I don't think it has anything to do with the size of the user base, but more with the hassle of getting it to work everywhere, not to mention the 'I want my source' attitude that seems to run rampant in the Linux crowd.

    Not the best infrastructure for commercial development in my opinion.

    (Anyway, who needs Word? I don't even use it on Windows... :P)


    AlexMax said:
    The only three popular exceptions I can think of that don't use either are Slackware, Gentoo and Arch

    Slackware can install both .deb and .rpm packages. This is how I run Google Earth, Picasa and the drivers for a network printer/scanner.

    I don't know about Arch, but Gentoo users wouldn't be using a precompiled binary, they'd build a custom one with fine-tuned (i.e. broken) compiler flags for their particular system. If they actually have time to play games between "emerge world" invocations, that is...

    Graf Zahl said:

    Why do you think so little commercial software is developed for Linux? I don't think it has anything to do with the size of the user base, but more with the hassle of getting it to work everywhere.

    I don't buy that. Developers who do support Linux are under no obligation to make it run perfectly in every possible configuration. Linux, being what it is, has many more oddball configurations, but most developers I know of who sell commercial software pick one or a few distributions to officially support and tell everyone else that they're on their own. As long as you support Ubuntu and one or two others, you're fine.

    not to mention the 'I want my source' attitude that seems to run rampant in the Linux crowd.

    This I think is a better explanation, and probably part of the reason, but still not quite there, given how few people actually complain about running non-free video drivers in practice. Developers don't port software to Linux if they think that the amount they get back from selling it to Linux users is less than the amount they put into porting it. Although you could sell something like IDEA, Sublime Text 2 or UltraEdit to a professional developer who needs a good development environment, or something like Mathematica or Maple to mathematicians and engineers who need it, the number of Linux users willing to pay for a game like Half-Life 2 or a word processor like WordPerfect must not be enough for the financials to work out.

    To put it bluntly, the users in those markets don't really care about what operating system their software runs on, and hobbyists who run Linux don't want to pay for software period, meaning they would either not buy it outright or pirate it.


    Nah, many Linux users will happily run binaries without source, and you can see that with some video drivers (esp. Nvidia), wifi blobs (lots of those), various commercial games, commercial databases (Oracle), and even a bunch of office suites (before OpenOffice got popular). Heck, doesn't FMOD ring any bells?...

    Old-time commercial Unix systems of course also had a bunch of closed-source software as well.

    I'm a bit different in the sense that the only closed software that's acceptable to me is really old stuff like 8/16-bit games and similar things that have some kind of historical and/or emotional value to me, and those only get run under emulation. But apart from a small number of OpenBSD users and some of the more die-hard Debian GNU/Linux users, you'll find that commercial closed-source software is fine, so long as it brings something interesting and of value to the table. Otherwise, hey, there's already tons of free stuff out there...


    hex11 said:
    OpenBSD users

    Oh yeah, rpm tools run on OpenBSD, too. (But binaries would of course only be useful using Linux emulation on i386.) So that and .deb covers pretty much everything...

    AlexMax said:

    the number of Linux users willing to pay for a game like Half-Life 2 or a word processor like WordPerfect must not be enough for the financials to work out.

    To put it bluntly, the users in those markets don't really care about what operating system their software runs on, and hobbyists who run Linux don't want to pay for software period, meaning they would either not buy it outright or pirate it.

    This is just flat-out false. Most people aren't using Linux merely because they won't pay for software, and most certainly aren't opposed to paying. In fact, before OpenOffice got popular, plenty of office suites were sold for Linux, including Corel WordPerfect.

    The reason it only seems like people aren't paying for Linux software is the general lack of for-pay software actually targeted at anything larger than a niche market. VMware, Mathematica, and bunches of others are small markets to begin with. Even Adobe Photoshop, often mentioned, wouldn't get many sales because it's rather niche; graphics-oriented Linux users are likely already using the GIMP, or just running Photoshop in Wine if they have it.

    The Humble Indie Bundles pretty much single-handedly debunk the whole "Linux users are cheapasses who won't pay for anything!" myth. As for your Half-Life 2 example, if the full game actually appeared for Linux, Valve would most likely just port Steam to Linux, and all your already-purchased games that have Linux versions would appear without having to rebuy them, as happened when they ported Steam to Mac OS; I'm sure pretty much everyone who wants to play Half-Life 2 has already bought it...

    AlexMax said:

    No? Just those two cover Debian/Ubuntu/Mint and Fedora/RHEL/CentOS/Scientific Linux/openSUSE/Mandriva. The only three popular exceptions I can think of that don't use either are Slackware, Gentoo and Arch, and those are for advanced users.


    Well, there's always Murphy's law applied to Linux. That One Program/Wunder-utility/whatever you happen to need will not be in ANY repo, will only be distributed from source (or in some package format that your distro of choice can't handle no matter what), and compiling it will not be just a matter of typing './configure' and 'make' ;-)

    AlexMax said:

    Grandma doesn't need to install anything, ever, as long as her system updates itself.


    That's all fine and dandy if she is indeed never supposed to install anything new, relying only on automated updates (or the administrator, namely you) to install new stuff that she may fancy or to fix stuff that breaks. The problems will start when her friends start asking her to chat via Skype or some other software that's Windows-only (or which has a far superior Windows version), or when the funderful powerpoints and "fun" exes her friends send her don't display correctly or don't "click and run". Have fun explaining why, and putting up with all the bitching and moaning about wanting a "real" computer like her friends have ;-)

    Of course, if you are able and willing to unilaterally dictate, in non-negotiable terms, 100% of what grandma will and will not be able to use, disregard the above.

    Many people mentioned:
    Commercial software and *nix/Linux


    To be frank, the situation isn't better or worse than in the times of "real" Unices. Yeah, of course there was commercial software for them (some of it exclusive, too), no matter how easy or hard it actually was to achieve the fabled source compatibility that every "good" *nix program was supposed to have (at least within a certain revision/standard).

    And of course not ALL possible variations or architectures were supported (e.g. supporting System V R4 didn't automatically mean that it would run seamlessly on something like this just by compiling natively, even if there was a namesake match).

    About Word: yeah, right. Try applying for a clerical job without knowledge of it (or even official certification, at least in the EU, with the ECDL and all), or explaining to some pointy-haired HR type that OpenOffice/LibreOffice is equivalent or better ;-)


    Ah, this old chestnut.

    In theory, shipping Linux binaries is no more difficult than shipping Windows binaries. The kernel ABI is very stable, and so is the standard C library, and if you include any additional dependencies as your own .so files (the Linux equivalent of DLLs) in your package, it will run anywhere. Alternatively you can use statically linked binaries to get the same result. Some commercial software like VMware is distributed exactly like this.
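The "include your own .so files" approach usually means shipping a small launcher next to the real binary. Here's a hedged sketch (all names are placeholders, and the "binary" is just an echo stub so the demo is self-contained) of the layout that commercial Linux software commonly uses:

```shell
#!/bin/sh
# Sketch of the bundled-libraries layout: a launcher script points the
# dynamic linker at a lib/ directory shipped next to the real binary.
# "mygame" and friends are placeholders, not any real product.
set -e
mkdir -p demo/lib
printf '#!/bin/sh\necho "ld path: $LD_LIBRARY_PATH"\n' > demo/mygame.bin
chmod +x demo/mygame.bin

cat > demo/mygame <<'EOF'
#!/bin/sh
# Resolve the directory this script lives in, then prepend its lib/
# to the dynamic linker search path before exec-ing the real binary.
HERE=$(dirname "$(readlink -f "$0")")
LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}" \
    exec "$HERE/mygame.bin" "$@"
EOF
chmod +x demo/mygame

./demo/mygame
```

Because the launcher computes its own location, the whole tree keeps working no matter where the user unpacks it.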

    The problem is that that isn't really how things are "supposed" to be done on Linux. What you're supposed to do is use packages (.rpm, .deb, or other), and your package is supposed to depend on the packages for the libraries it uses. When this works, it's great, and the whole experience is actually much, much nicer than on Windows - a single apt-get install command and in a few minutes it's on your computer.

    Unfortunately, it's a huge pain for someone who wants to make and ship a third party package, because (1) different distributions use different packaging systems (dpkg vs. rpm vs. other), (2) when they use the same packaging system, they can have different names for the same package (SDL_mixer on Fedora, libSDL_mixer on Mandriva, libsdl-mixer1.2 on Debian), (3) different distributions ship different versions of libraries.

    For problem (1), you can probably get 95% of the market by shipping an .rpm and a .deb package - it's fair to assume that everyone else is knowledgeable enough to figure things out for themselves. Problem (2) isn't as bad as it seems either - in the .deb world, package names are pretty consistent, and you can make .rpm files that depend on files rather than package names, so they aren't tied to a specific distribution.

    Problem (3) is more of a problem. Suppose you depend on library libfoo; in 2008, libfoo adds a new API function. Now, in 2011, can you use that function? If you're on a fast-moving distro like Fedora or Ubuntu, the answer is almost certainly yes. If you're on a "stable" distribution like Debian stable or Redhat Enterprise, the answer might be no - they have packages derived from old, tested versions that might be 3-5 years old. It's also worth noting that some users might still be using very old versions of distributions anyway.

    What I recommend doing is compiling your program on an old version of a distro - set up a VM running a system from 3-4 years ago and get it running on there. Most systems will run old binaries with no problems - the maintainers of glibc and Gtk+ have put a lot of effort into maintaining backwards-compatible ABIs, for example. The key thing is not to compile it on your latest, shiny, cutting-edge Ubuntu install. If it runs on a system from four years ago, it's safe to assume it will run on pretty much all modern systems.
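One way to sanity-check that advice after the fact is to look at the glibc symbol versions a binary actually requires: if the highest version it pulls in is old (the CentOS 5 era topped out around GLIBC_2.5), newer systems will load it fine. A hedged sketch using binutils:

```shell
#!/bin/sh
# Print the highest glibc symbol version a dynamically linked binary
# requires. /bin/ls is just a stand-in; pass your own binary as $1.
BINARY="${1:-/bin/ls}"
objdump -T "$BINARY" 2>/dev/null \
    | grep -o 'GLIBC_[0-9.]*' \
    | sort -uV \
    | tail -n 1
```

If this prints GLIBC_2.5 or lower, a CentOS 5-vintage system (and therefore everything after it) should be able to run the binary.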


    I've seen widely contrasting views on the topic being vehemently defended around the internet, which shows how much of a serious business Linux can be. Like most of IT, there are plenty of opposing schools of thought, and each may have some merit as long as they actually get work done. E.g. I once had a bitter flamewar with some other IT guy over my not using network boots or recovery images (PROTIP: they were useless features in my corporate network, as no two machines were alike, and all they could cause were inexplicable midnight boots).

    I can sum most of these debates up as:

    • A: In my approach to Z, I do X.
    • B: WHAT?!!! We don't do X in MY neck of the woods! We FIRE people like A who do X when dealing with Z! We do Y instead.
    • C: Sometimes I do X and sometimes I do Y. Both are good when dealing with Z, depending on the circumstances. No harm done. Everything is safe.
    • A & B: HELL NO!!!! YOU ONLY DO X OR Y!!!! NO COMPROMISE!!!!
    I call B's approach the "We don't like strangers in our town, damn cattle thief!" mentality. A may or may not be aware of approach Y, or it may simply not apply to him, but he gets attacked for it nonetheless. C is usually the only guy who can balance between the two (but C-types are often an invisible minority in such debates). These debates often end up with both A and B thinking that the other is utterly incompetent, and that they would gouge each other's eyes out if they ever had to cooperate on anything involving X, Y or Z ;-)

    Maes said:

    Well, there's always Murphy's law applied to Linux. That One Program/Wunder-utility/whatever you happen to need will not be in ANY repo, will only be distributed from source (or in some package format that your distro of choice can't handle no matter what), and compiling it will not be just a matter of typing './configure' and 'make' ;-)

    Which is precisely why you support a few popular distributions and let everyone else sort their own mess out.

    That's all fine and dandy if she is indeed never supposed to install anything new, relying only on automated updates (or the administrator, namely you) to install new stuff that she may fancy or to fix stuff that breaks. The problems will start when her friends start asking her to chat via Skype or some other software that's Windows-only (or which has a far superior Windows version), or when the funderful powerpoints and "fun" exes her friends send her don't display correctly or don't "click and run". Have fun explaining why, and putting up with all the bitching and moaning about wanting a "real" computer like her friends have ;-)

    Of course, if you are able and willing to unilaterally dictate, in non-negotiable terms, 100% of what grandma will and will not be able to use, disregard the above.

    Are you insane? Of course I don't want Hypothetical Grandma running random crap off the internet, that's the entire reason why I gave her a Linux box and not Windows XP. Any distro I would set Hypothetical Grandma up with would have Skype working fine and LibreOffice installed before I left. If she actually has a usage pattern that requires Windows, I would set it up for her...once...and tell her that because it's so easy for it to get infected with viruses, she'll have to send it to the local computer shop to get it cleaned, because I just don't have time to clean up Windows installs.

    About Word: yeah, right. Try applying for a clerical job without knowledge of it (or even official certification, at least in the EU, with the ECDL and all), or explaining to some pointy-haired HR type that OpenOffice/LibreOffice is equivalent or better ;-)

    Can't speak for Graf, but you'll never see me claim LibreOffice is better than Microsoft Office, especially when it comes to opening common Office documents. I don't think Office is worth $125 for me personally, though, and I don't care about pixel-perfect precision or the extra features it offers enough to pay for them. If it just so happens to appear on my work PC because someone else paid for it, fantastic; otherwise, Google Docs and LibreOffice are fine.

    fraggle said:

    What I recommend doing is compiling your program on an old version of a distro - set up a VM running a system from 3-4 years ago and get it running on there. Most systems will run old binaries with no problems - the maintainers of glibc and Gtk+ have put a lot of effort into maintaining backwards-compatible ABIs, for example. The key thing is not to compile it on your latest, shiny, cutting-edge Ubuntu install. If it runs on a system from four years ago, it's safe to assume it will run on pretty much all modern systems.


    Ding ding ding! I dunno how I missed saying this explicitly earlier, but this is precisely how you create portable packages on Linux. My distro picks for the two virtual machines would be one running CentOS 5.6 with added EPEL repositories and another running Debian 5.0.
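    A rough way to sanity-check the result is to ask the binary itself which glibc symbol versions it links against: if the newest GLIBC_x.y tag it requires is no newer than the glibc on your oldest target distro, it should load there. A sketch, assuming binutils' objdump is installed (the binary name "mygame" is just a placeholder):

```shell
#!/bin/sh
# Print the newest GLIBC_x.y symbol version a binary requires.
# "./mygame" is a placeholder; pass your real binary as $1.
BIN="${1:-./mygame}"

# objdump -T lists dynamic symbols with their version tags;
# extract the GLIBC_* tags, version-sort them, keep the highest.
objdump -T "$BIN" \
  | grep -o 'GLIBC_[0-9][0-9.]*' \
  | sort -u -V \
  | tail -n 1
```

    If this prints, say, GLIBC_2.4, the binary should load on anything shipping glibc 2.4 or newer; the same trick works for the GLIBCXX_ tags libstdc++ uses.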

    Share this post


    Link to post
    AlexMax said:

    Are you insane? Of course I don't want Hypothetical Grandma running random crap off the internet, that's the entire reason why I gave her a Linux box and not Windows XP. Any distro I would set Hypothetical Grandma up with would have Skype working fine and LibreOffice installed before I left.


    You are absolutely right as far as security is concerned. As I have said in the past, setting up a Linux box (or, for that matter, any non-Windows computer) makes one automatically immune to 99.9999% of all current malware. But you could have achieved the same thing with a non-networked computer, a Mac, a game console, a WebTV set-top box or a DOS machine (well, assuming she doesn't find Yankee Doodle and Stoner infected floppies ;-), which brings me to my next point:

    OK, you installed LibreOffice...and she discovers that the DOC or "funny pics" PPT her friends sent her doesn't display correctly. And I don't mean "off by one pixel", I mean anywhere from visibly off to a fucking mess. How do you think she'll react, especially if "her friends can see it just fine"? As superficial as it sounds, that's a clear strike against free software, and no small part of why it's not more widespread.

    The above incompatible PPT/DOC examples are just the tip of the iceberg. Then come those "interactive CDs for the whole family" (autorun.exe... ouch), exFAT-formatted flash drives (if you know of a way to automate mounting those, I'm all ears), cheapo Chinese hardware that "just works" when plugged into her friends' PCs but not hers, etc. etc.
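    For the exFAT case, one possible answer: with a FUSE exFAT driver installed, a single /etc/fstab line lets a non-root user mount the drive by label. A sketch only; the exfat-fuse/exfat-utils packages, the label FLASHDRIVE, the mount point and the uid are all assumptions:

```
# /etc/fstab entry (sketch): assumes the exfat-fuse and exfat-utils
# packages are installed and the stick is labeled FLASHDRIVE.
# "user,noauto" lets a non-root user mount it; most desktop file
# managers will then offer it with a click.
LABEL=FLASHDRIVE  /media/flash  exfat  user,noauto,uid=1000  0  0
```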

    So, from my point of view, you're making a tradeoff or even a gamble: you guarantee yourself a relative amount of tranquility by not having to worry about viruses etc., while making things harder, more obscure and more restrictive for Hypothetical Grandma (and possibly getting bombarded with "Why doesn't this or that work?!" kind of calls).

    This may be acceptable in a controlled work environment where the user has no choice or say in the matter, but it's not so readily applicable to "free" home users, at least not after they discover what they're missing by not using a mainstream OS and related software.

    Share this post


    Link to post
    Maes said:

    OK, you installed LibreOffice...and she discovers that the DOC or "funny pics" PPT her friends sent her doesn't display correctly. And I don't mean "off by one pixel", I mean anywhere from visibly off to a fucking mess. How do you think she'll react, especially if "her friends can see it just fine"? As superficial as it sounds, that's a clear strike against free software, and no small part of why it's not more widespread.

    The above incompatible PPT/DOC examples are just the tip of the iceberg. Then come those "interactive CDs for the whole family" (autorun.exe... ouch), exFAT-formatted flash drives (if you know of a way to automate mounting those, I'm all ears), cheapo Chinese hardware that "just works" when plugged into her friends' PCs but not hers, etc. etc.


    I would frame that as an argument against proprietary, non-standard software.

    Share this post


    Link to post
