
Which operating system do you use most?

100 members have voted:

  • Windows: 81
  • Mac OS X: 4
  • Linux: 15
  • Other: 0



MikeRS said:

You can also buy pre-installed Linux computers, sometimes even with OEM-specific customization (e.g., Dell does them)! Where's your argument now?


Maybe the situation is different somewhere else, but here in Germany it is virtually impossible to find such offers, and when they can be found, they can't compete with Windows systems at the same price.

Espi said:

Is there something I'm missing here? Because it seems like the only thing you get beyond Home Basic is crap you don't need: http://www.microsoft.com/windows/windows-vista/compare-editions/default.aspx


Most people never need these features, but I have only seen the Basic edition preinstalled on low-budget systems. Most machines come with Home Premium, although I have to admit that I never used any of these features. I even disabled Aero after the initial effect (which wasn't that great to begin with) wore off.


There is no point in buying Vista by itself. But it doesn't really hurt to get it with a new computer.


Indeed. Let's just hope that some game developers outside M$ are not stupid enough to develop DX10-only games...

Jodwin said:

He had only mentioned that he turned off UAC. Turning that off takes less than five minutes. You can't install Ubuntu in less than five minutes. You lose.



UAC is the only feature that really gets in the way, but here are the things I changed in the Windows settings:

- disabled Aero
- disabled UAC
- set the start menu to 'classic' look.

After that it was basically XP with a nicer looking GUI.

That said, of course you can't set up a computer in five minutes. Depending on the amount of software you need to install, it may still take an entire day until you get everything working as intended.

MikeRS said:

As for real features, Ultimate Edition is the only one that includes disk encryption itself, though you might as well save yourself $300 and just download TrueCrypt, which can do the same thing, even on Windows XP or the "lower" editions of Vista.


Questions:
- Does anyone need disk encryption on a privately used PC only two people have access to?
- Would anyone who needs data protection ever use a program developed by a large company that may have ties to who knows whom? I sure wouldn't.

Bottom line: The more expensive Vista editions are indeed a waste of money. But buying Vista to upgrade an old computer is a waste of money no matter what.

Graf Zahl said:

Maybe the situation is different somewhere else, but here in Germany it is virtually impossible to find such offers, and when they can be found, they can't compete with Windows systems at the same price.

Try seeing http://www.debian.org/distrib/pre-installed#de for some companies that offer them in Germany. I checked a couple of the sites, and the systems had rather old hardware; perhaps I just got unlucky. Plus, I don't speak or read German, so it's hard to navigate the sites myself.

Questions:
- Does anyone need disk encryption on a privately used PC only two people have access to?
- Would anyone who needs data protection ever use a program developed by a large company that may have ties to who knows whom? I sure wouldn't.

- Assuming it's a husband-and-wife desktop computer type of thing, probably not. Disk encryption is generally used on laptops or in other situations where your computer and/or disk might be compromised or fall into the wrong hands. Whether it would benefit you really depends on your needs.
- Probably not, and that's why I suggested TrueCrypt. :)

MikeRS said:

Try seeing http://www.debian.org/distrib/pre-installed#de for some companies that offer them in Germany. I checked a couple of the sites, and the systems had rather old hardware; perhaps I just got unlucky. Plus, I don't speak or read German, so it's hard to navigate the sites myself.



These systems aren't bad, but the main problem is that they are sold by specialty dealers whose prices are higher than what I'd pay for something comparable off the shelf in the local electronics store.


Linux is boring. Let's go off-topic again.

DJ_Haruko said:

I've been using pen and paper, with IPoAC (IP over Avian Carriers) as my transport protocol.


This is actually a more efficient protocol than you might think. In between flights, you can use the same pigeons to rank web sites.


Jodwin: As stated above, he mentioned UAC and Aero, but there's also getting rid of all that crapware that every single new brand-name computer comes with. Yes, I've heard that a company or two lets you get a computer without the crapware, for a price.

On a slightly unrelated note, for those of us who refuse to buy brand-name desktops, whether for their generally meh hardware quality or for whatever other reason, Windows is always both more expensive and more time-consuming.

John Smith said:

Jodwin: As stated above, he mentioned UAC and Aero, but there's also getting rid of all that crapware that every single new brand-name computer comes with. Yes, I've heard that a company or two lets you get a computer without the crapware, for a price.



It's not that hard to go to the software menu and uninstall everything you don't need, and compared to the time needed to install the software you do need, it's irrelevant.

John Smith said:

Jodwin: As stated above, he mentioned UAC and Aero, but there's also getting rid of all that crapware that every single new brand-name computer comes with. Yes, I've heard that a company or two lets you get a computer without the crapware, for a price.

Yeah, and Aero is even faster to disable than UAC. Still nowhere near the time it takes to install Ubuntu (and I won't bother repeating what Graf just said about uninstalling software).

Simply put, the whole time argument against disabling Vista's "big two" features cannot be won.

You lose.


Yeah okay, I've done this, way more than either of you I'm sure, and no, it takes ages to get rid of crapware: multiple forced reboots, tediously slow uninstallers, software that makes you uninstall it by component, etc., etc. It takes at least an hour just to do a basic uninstall of all the shit that a new Dell XPS, for instance, comes with, and then scrubbing it entirely from your system (removing remaining files, deleting registry entries) takes even longer. I bet even if you did this like a hellbent fucking commando the entire process would still top 60 minutes, and I assure you I can install most distros of Linux faster than that. Fuck that shit, I can do the what-should-be-mandatory re-install of OS X on a new MacBook faster than that, which is saying something.

Put plainly, a new Dell or Gateway or whatever takes longer to get to a non-shitty state with Windows than with Linux, end of story.


John Smith said:
Jesus wtf I only posted this once

Linux is better since it can post three times when Windows can only post once.

John Smith said:

Yeah okay, I've done this, way more than either of you I'm sure, and no, it takes ages to get rid of crapware.

I would argue this to be true pre-2000/XP.

Multiple forced reboots

I would argue this to be true pre-2000/XP.

tediously slow uninstallers,


The only slow uninstallers are the ones that actually uninstall everything.

software that makes you uninstall it by component

What software would that be? I'm curious.

It takes at least an hour just to do a basic uninstall of all the shit that a new Dell XPS, for instance, comes with,

As the owner of an M1310: bullshit across the board. I had absolutely no issue with crapware except trial software, which took all of five minutes to uninstall.

and then scrubbing it entirely from your system (removing remaining files, deleting registry entries)


Don't blame Windows for this. This is rot, and it happens to Linux, too. In fact, it happens in Linux faster, to the point that many users keep their home directory on a separate partition and wipe the main HD from time to time. There is no excuse for this, but it isn't the OS's fault. The number of links and once-used objects adds up quickly.

I bet even if you did this like a hellbent fucking commando the entire process would still top 60 minutes


Try fifteen.

and I assure you I can install most distros of Linux faster than that.

It's not that hard to click "next" and enter some basic information and a product key.

Fuck that shit, I can do the what-should-be-mandatory re-install of OS X on a new MacBook faster than that, which is saying something.

No comment.

Put plainly, a new Dell or Gateway or whatever takes longer to get to a non-shitty state with Windows than with Linux, end of story.


I just bought a Gateway in October, and I don't remember having any crapware on it at all. Ever. Maybe our definitions are different.

Csonicgo said:

I just bought a Gateway in October, and I don't remember having any crapware on it at all. Ever. Maybe our definitions are different.

I honestly don't trust Gateways... I bought one a while ago and the crappy thing literally melted on the inside. I turned it on, it started to smoke, I opened it up, and the motherboard was completely melted.

Craigs said:

I honestly don't trust Gateways... I bought one a while ago and the crappy thing literally melted on the inside. I turned it on, it started to smoke, I opened it up, and the motherboard was completely melted.


Has it gotten that bad? I remember when Gateway was like Dell in that you had to order one or go to a special place to get one; now they're everywhere and quality seems to be a coin flip. Luckily the one I have now hasn't exploded yet.

Csonicgo said:

Don't blame Windows for this. This is rot, and it happens to Linux, too. In fact, it happens in Linux faster, to the point that many users keep their home directory on a separate partition and wipe the main HD from time to time. There is no excuse for this, but it isn't the OS's fault. The number of links and once-used objects adds up quickly.

Uh, it doesn't happen in Linux, and that's not why separate partitions are used.

In fact, it's quite uncommon to actually reinstall the operating system for any reason other than switching to a new distribution (I'm sure Graf will try to counter this one, but I'll make the point that most users will just pick a distribution and stick with it without some very convincing reason to switch; it's the technical users who really switch distros).

MikeRS said:

Uh, it doesn't happen in Linux, and that's not why separate partitions are used.

So you're saying that the Linux file system forces applications to write some kind of a watermark to every file they create, so that when you uninstall that application Linux knows to do a full search of all your hard drives and delete every file that shares the program's watermark?

No? Well isn't it then up to the program developers to write their programs so that everything gets cleaned up on uninstall, and I'm 100 % certain that not all programs are written to be clean.


No to both questions (though you forgot the question mark on the second, "Well isn't it up to the program developers...").

It's up to package managers to track every file installed, or up to the user if you decide to install things manually from source (if things are really messy and they don't provide a "make uninstall" or equivalent, I'd recommend installing to /opt/program-name so that removing it just means removing that directory). Installing from source is very uncommon for most users, and really only happens for specialty programs that may not necessarily be in a distribution's repository of software (e.g., the various Doom ports).
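
For illustration, a minimal sketch of that /opt approach, assuming a typical autotools source tree (the package name "foobar" is made up):

    $ tar xzf foobar-1.0.tar.gz && cd foobar-1.0
    $ ./configure --prefix=/opt/foobar
    $ make
    $ sudo make install

    # removing it later just means deleting the directory:
    $ sudo rm -rf /opt/foobar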


You have no idea how Linux apps are installed, do you? You can't just install a Linux app anywhere, because the installer predefines it to /usr/ (in which the subcategory is chosen). Unless it's user-installed (which normally means you know where all the files are stored anyway), most Linux installers like apt-get know how to remove the requested app cleanly.

MikeRS said:

It's up to package managers to track every file installed,



And these are infallible?

Sorry, but I just won't believe that file garbage can be avoided completely.

Although not being able to install to an equivalent of Windows\System is certainly a huge advantage. It's unbelievable that some applications still try that. :(

Jodwin said:

So you're saying that the Linux file system forces applications to write some kind of a watermark to every file they create, so that when you uninstall that application Linux knows to do a full search of all your hard drives and delete every file that shares the program's watermark?

No? Well isn't it then up to the program developers to write their programs so that everything gets cleaned up on uninstall, and I'm 100 % certain that not all programs are written to be clean.

If you've only ever used Windows, it's probably difficult to understand. On Windows, to install a program you run an "installer" that copies some files onto your hard drive and installs registry keys. If you're lucky, there's an "uninstaller" that does the opposite, although they usually don't work properly and you end up with cruft building up over time.

On Linux, every configuration file, program file and library is tracked by the package management system. Everything belongs to a "package"; packages are individual pieces of software that can be installed or uninstalled. Because there is a database of all the files that belong to a package, you can be sure that when you uninstall it, there will be nothing left behind. Systems like 'dpkg', the package manager used by Debian and Ubuntu, track the dependencies between packages, and can identify the packages that you no longer need and automatically remove them.

You can do the Windows thing as well, of course, and just copy files in. However, there are specific places in the filesystem that are allocated for this purpose, so they're kept separate from your main OS.
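
To make that concrete, here is a minimal sketch of the standard queries on a Debian/Ubuntu system ("foobar" stands in for any installed package):

    $ dpkg -L foobar               # list every file the package installed
    $ dpkg -S /usr/bin/foobar      # find which package owns a given file
    $ sudo apt-get remove foobar   # uninstall the package
    $ sudo apt-get autoremove      # drop dependencies nothing else needs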

Graf Zahl said:

And these are infallible?

Sorry, but I just won't believe that file garbage can be avoided completely.

It's not completely infallible, because you can devise packages that run scripts when they are installed, and the scripts can potentially leave cruft behind. In practice, though, it's rare. The majority of files forming the "program itself" installed as part of the package are tracked automatically and can always be removed.
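
One detail worth knowing here: Debian-style systems distinguish between removing a package and purging it, the latter also deleting its system-wide configuration files ("foobar" again being a placeholder):

    $ sudo apt-get remove foobar   # removes the program, keeps config files
    $ sudo apt-get purge foobar    # removes its config files too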

fraggle said:

It's not completely infallible, because you can devise packages that run scripts when they are installed, and the scripts can potentially leave cruft behind. In practice, though, it's rare. The majority of files forming the "program itself" installed as part of the package are tracked automatically and can always be removed.

This is what I was (mostly) talking about. Let's say I install a game to /usr/gameX/ and the game has code which, without asking anything, writes save games, temp files and other stuff to /usr/random/temp/. These completely new files aren't being tracked by your package manager, or are they?


Post-installation scripts are usually caught by package maintainers and handled appropriately. (In fact, the package manager needs to run the script itself, so it's pretty silly to imagine a case where it's not handled.) Any extra configuration or data files they create are uninstalled as normal.

I've never seen an official distribution repository failing at this basic task, though it's conceivable for cruft to be left behind by incompetent package maintainers, and this especially happens in unofficial repositories. These repositories usually promise to make the distro "easy", offering codecs that are illegal to distribute or other such trivial things, and most of them fuck around with system files they really shouldn't be touching; uninstalling packages or upgrading to a new distribution version often causes problems with them. Nobody sane ever recommends using them; you really need to be very careful when adding unofficial repositories. Using them pretty much makes your package management as useful as MS Windows'.

(This isn't necessarily saying that all unofficial repositories are always bad, but you should be very careful when adding them. The WineHQ repository and VirtualBox repositories, for example, have very specific uses and are generally safe. Random Joe of the Week's "Easy Multimedia!" repository is generally not safe.)
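
For reference, adding a repository (official or not) on a Debian/Ubuntu system just means appending a line to /etc/apt/sources.list and refreshing the package index; the URL below is purely illustrative:

    # /etc/apt/sources.list
    deb http://example.com/repo stable main

    $ sudo apt-get update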

Jodwin said:

This is what I was (mostly) talking about. Let's say I install a game to /usr/gameX/ and the game has code which, without asking anything, writes save games, temp files and other stuff to /usr/random/temp/. These completely new files aren't being tracked by your package manager, or are they?

This is wrong on many levels:
1. Untracked programs should never be in /usr directly. /usr/local is for such things, although its filesystem structure generally follows the same standard as /usr. Things that want their own contained directory should probably go in /opt/GameX; this is (mostly) what /opt is for. (Some particularly old games like to use /usr/local/games/GameX, though this is fairly uncommon now, and it's really not any harder than the /opt/GameX thing.) See the rough map at the end of this post.
2. Save games and other such things should always go to $HOME/.GameX, or else it is a serious design flaw and the publisher/developers are quite likely to get many angry emails.
3. If you don't install from a package, it doesn't get tracked by the package manager; not a hard concept to understand. If it does have a package, the package manager will surely know about all the files/directories installed in /usr/bin, /usr/games, /usr/lib, /usr/share, etc.

Essentially, you're looking at the supposed problem from a viewpoint that doesn't even exist. :)
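
A rough map of the locations mentioned above (GameX being the made-up example):

    /usr/bin, /usr/lib, /usr/share   # owned and tracked by the package manager
    /usr/local/                      # locally built software, outside the package manager
    /opt/GameX/                      # self-contained, manually installed programs
    $HOME/.GameX/                    # per-user saves and configuration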

MikeRS said:

2. Save games and other such things should always go to $HOME/.GameX, or else it is a serious design flaw and the publisher/developers are quite likely to get many angry emails.
3. If you don't install from a package, it doesn't get tracked by the package manager; not a hard concept to understand. If it does have a package, the package manager will surely know about all the files/directories installed in /usr/bin, /usr/games, /usr/lib, /usr/share, etc.

Essentially, you're looking at the supposed problem from a viewpoint that doesn't even exist. :)

1. Your point 2 and the last sentence in your post are quite contradictory. My exact point is that whether the uninstaller removes all the crap the program installs is, in the end, up to the program creators. If the creator for some reason whatsoever wants to create "shit" that won't clean up properly, there's nothing the OS can do about it, be it Linux or Windows.

2. On your third point: if the game (or some other program) creates new files during runtime, will the package manager keep track of them? Or will it keep track of them only if they are created in a particular directory?

Jodwin said:

1. Your point 2 and the last sentence in your post are quite contradictory. My exact point is that whether the uninstaller removes all the crap the program installs is, in the end, up to the program creators. If the creator for some reason whatsoever wants to create "shit" that won't clean up properly, there's nothing the OS can do about it, be it Linux or Windows.

How is it contradictory? You're proposing a hypothetical situation that has never happened. It's possible, but it requires that the user have write permissions to all those places (which no sane distro ever grants by default; I say "sane distro" because there might be some "1337 Haxorz T33nager" distro that gives you root by default, but they're obviously not sane), and the program would have to be extremely poorly designed. Plus, if you're really running such a hazardous program, you can always put it in a chroot so that it can never write outside a particular directory short of a kernel security flaw (in fact chroots are common for things like web servers, which sometimes let people remotely manipulate the filesystem via security flaws). So yes, the operating system can do something about this.
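
A minimal sketch of that chroot trick, assuming the jail directory already contains copies of whatever libraries the program loads (the paths are illustrative):

    # run an untrusted program with / remapped to /srv/jail,
    # so it cannot write outside that directory:
    $ sudo chroot /srv/jail /bin/sh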

2. On your third point: if the game (or some other program) creates new files during runtime, will the package manager keep track of them? Or will it keep track of them only if they are created in a particular directory?

See my earlier paragraph. Why would a program ever write new files anywhere besides $HOME and /tmp? Not to mention it requires the user to have write permissions in the directories you're concerned about...

Jodwin said:

1. Your point 2 and the last sentence in your post are quite contradictory. My exact point is that whether the uninstaller removes all the crap the program installs is, in the end, up to the program creators. If the creator for some reason whatsoever wants to create "shit" that won't clean up properly, there's nothing the OS can do about it, be it Linux or Windows.

2. On your third point: if the game (or some other program) creates new files during runtime, will the package manager keep track of them? Or will it keep track of them only if they are created in a particular directory?



Don't you get it? Linux is perfect!

Why is it perfect? Because all software created for it acts in a sane and sensible manner! (At least if I were to believe the people who say it's impossible to leave files behind.)

Well, I'd say this is a pretty big assumption. The problem is, people are not perfect. People make errors. Such errors may have consequences.

fraggle's technical explanation is the only post that really helps here. It's good to see that at least one person is capable of presenting information in a way that is understandable to non-Linux users, so they can form an opinion.

That said, there is no software that cannot be broken, no safeguard that cannot be circumvented (as everyone seems to agree).

So, while it may be hard to create cruft on Linux, it is basically agreed that it's not impossible. Why do we need to continue this argument?


Also, what's up with different file formats for different package managers? Is it really necessary that each one does its own thing? Where's the compatibility? ;)

Graf Zahl said:

Also, what's up with different file formats for different package managers? Is it really necessary that each one does its own thing? Where's the compatibility? ;)

Essentially, there are two different package formats: dpkg (.deb) and rpm (.rpm). If you search, you'll find other minor distributions with their own package formats, but those are the main two. There are some minor differences in the feature sets provided by the two, but they essentially do the same thing. Each distribution tends to settle on a single package format: Debian-based systems (including Ubuntu) use dpkg, while Fedora/Red Hat-based systems use RPM.

The more serious problem, however, is differences between distributions rather than between package formats. Packages are essentially just archive files containing a collection of files to be installed; no big compatibility issue there. The bigger problem is that different distributions can have different versions of packages.

For example, suppose you install a program named "foobar" and it depends on a library named "libfoo". The packages track this dependency and will refuse to install foobar unless you have libfoo installed as well. Before you say this sounds like a nightmare waiting to happen: fortunately, Linux package systems are clever enough that if you try to install "foobar", they will automatically calculate all the dependencies you need and install libfoo as well.
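
On a Debian-style system, fraggle's hypothetical "foobar"/"libfoo" scenario would look roughly like this:

    $ apt-cache depends foobar      # shows that foobar depends on libfoo
    $ sudo apt-get install foobar   # resolves and installs libfoo automatically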

This all works fine ... in isolation. Within a particular distribution, you can install packages and remove them, and it all works beautifully and elegantly. However, if you want to mix and match packages from different distributions, things get trickier.

Suppose you have the Fedora .rpm version of "foobar", and you want to install it on your Ubuntu system. In principle, it should be easy to just convert it to a .deb and install it, right? In reality, it's slightly trickier, because any of these things could happen:

  • The package names can be different across distributions: e.g., libfoo could be called "foo-libs" on one distribution and "libfoo" on another. The same thing can happen with version numbers.
  • The foobar package could have been made for a version of Fedora released six months after your Ubuntu distribution, so it may depend on a slightly newer version of libfoo that is not available on your distribution.
  • The two might be incompatible because libfoo was compiled differently on the two distributions (though this probably isn't too likely).
It is possible, in this situation, to just ignore the dependencies and install it anyway. To be honest, all Linux distributions use essentially the same set of software anyway, so this is likely to work. However, it means that the nice automatic dependency installation system no longer works! D'oh!
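
For the curious, the usual tools for this are "alien", which converts between package formats, and dpkg's force options, sketched here on the hypothetical package (both are best treated as a last resort):

    $ sudo alien --to-deb foobar-1.0-1.i386.rpm   # produces something like foobar_1.0-2_i386.deb
    $ sudo dpkg --force-depends -i foobar_1.0-2_i386.deb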

To be honest, I'm inclined to believe that the Linux environment is intrinsically hostile to proprietary software just because of this situation. It's really not very pretty. Different companies distributing Linux versions of their software take different approaches: some static-compile everything, removing the dependency problem. Some, like VMware, provide their own versions of all of the libraries they use. Most simply avoid providing proper packages altogether, although I notice that Skype actually provides packages for half a dozen different distributions.

Having said all of this, I don't actually find any of this much of a problem myself, because I don't really use much proprietary software anyway. However, although the Windows system is brain-dead and has obvious problems, the Linux situation obviously has separate problems of its own.

Jodwin said:

This is what I was (mostly) talking about. Let's say I install a game to /usr/gameX/ and the game has code which, without asking anything, writes save games, temp files and other stuff to /usr/random/temp/. These completely new files aren't being tracked by your package manager, or are they?

Well, there are several things to explain here. Firstly, there is a distinction between program files, temporary files and "document" files (for want of a better phrase; this is what savegames would come under). The package manager tracks the program files and any system configuration files: for example, if you installed a web server, the web server and all its configuration files are tracked and automatically removed on uninstall. If you had a program that normal users run, each user can have separate configuration, so they would have a configuration file in their home directory for that program. This would not be removed. For temporary files, there's a location in the file system called /tmp, which is cleaned out automatically every time the system boots.

So you can potentially have cruft left over in the form of configuration files left in your home directory, but it depends on the type of program (in general, interactive programs are the ones likely to leave one).
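
By way of example, that per-user configuration lives in dotfiles under your home directory, while /tmp holds the disposable stuff:

    $ ls -a ~    # per-user configuration: dotfiles like .mozilla
    $ ls /tmp    # temporary files, cleaned out at boot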

fraggle said:

For temporary files, there's a location in the file system called /tmp, which is cleaned out automatically every time the system boots.

Either that, or files older than one to six months are deleted (depending on the system configuration), though most desktop users are unlikely to have their system up long enough to notice that feature. This sometimes has the tragic and humorous effect of users on corporate servers storing files in /tmp for very long periods of time, and then finding their important documents deleted after not being accessed for a while. (This is a failure of the user not realizing that important documents *do not* belong in a temporary storage area, rather than of any OS design.)


Thanks for that informative post.

Finally, someone who can explain it in a manner that is understandable and doesn't talk down the problems that exist.

In particular, your statement about proprietary software is actually a big concern. This is probably the major issue that keeps Linux from becoming mainstream. If developers could just create an installation package and not have to worry about such things, I'm sure commercial software for Linux would become easier to obtain.

I'm just wondering: on Windows, most applications circumvent these dependency issues by making all required dependencies part of the installation itself, especially for things that don't need to be installed in the system (like most DLLs). Can't this be done under Linux as well, in cases where dependency issues can't be afforded? Yes, it would certainly increase file size and probably memory usage, but sometimes that seems to be the lesser evil compared to having software that may not install properly.

Graf Zahl said:

In particular, your statement about proprietary software is actually a big concern. This is probably the major issue that keeps Linux from becoming mainstream. If developers could just create an installation package and not have to worry about such things, I'm sure commercial software for Linux would become easier to obtain.

I tend to agree, although as I mentioned in my previous post, for me it isn't really a concern because almost everything on my Linux install is open source anyway. The only thing that I really "miss" is games - I usually boot into Windows XP if I want to play modern games. I think Wine is supposed to be quite good at that sort of thing nowadays though.

The really silly thing about the incompatibility situation is that a lot of the core stuff has pretty good binary backwards compatibility anyway. The kernel system call API hasn't changed in years, the GNU C library will run programs compiled against it 10 years ago, and the X11 protocol hasn't changed much since it was created 20 years ago. I know that the Gnome project puts a lot of effort into maintaining backwards-compatible interfaces, and the same is true of Qt, so even cross-distro GUI applications can probably be made. It might just be that someone needs to come along and simplify the whole process of releasing Linux binaries.

I'm just wondering: on Windows, most applications circumvent these dependency issues by making all required dependencies part of the installation itself, especially for things that don't need to be installed in the system (like most DLLs). Can't this be done under Linux as well, in cases where dependency issues can't be afforded? Yes, it would certainly increase file size and probably memory usage, but sometimes that seems to be the lesser evil compared to having software that may not install properly.

Well, I think I mentioned in my previous post that this is what VMware does. Different companies seem to take different approaches to the problem. It really needs to be fixed properly, though.
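
A common form of that bundling approach, for what it's worth, is to ship the libraries next to the binary and launch through a small wrapper script; everything here is illustrative:

    #!/bin/sh
    # launcher shipped in the application's install directory;
    # points the dynamic loader at the bundled libraries first
    HERE=$(dirname "$0")
    LD_LIBRARY_PATH="$HERE/lib" exec "$HERE/bin/foobar" "$@"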


I feel the need to thank you for your posts too, fraggle. Rarely does one encounter informative and clear posts about the way Linux works; usually all one gets is a wall of buzzword-laden condescension about how wonderful Linux is and how totally fucking wrong Windows is on all levels.

Heh, and luckily no one has opened those two cans of worms named "Drivers" and "Filesystem fragmentation" yet... oops ;-)


I don't really see a whole lot to fix on the side of proprietary software; it's mostly the vendors that refuse to go the standard routes. Take VirtualBox, for example: the proprietary version provides its own repository that neatly integrates with your system; it's easy to install, and you get software updates to it the same way you get updates to the rest (or most) of your software. VMware, on the other hand, makes the user run a script which does voodoo, and if you don't like reading Perl scripts, you can only hope that its vmware_uninstall.pl program actually removes everything the installer added. VMware also forces the user to perform updates and such manually.

It's really not fundamentally different from the distribution methods available to open-source applications, many of which also ignore the whole packaging deal (e.g., Chocolate Doom). (*)

(*) The checkinstall program is actually available to attempt to make packages from source builds the quick-and-dirty way, although it's not always successful. It will work on autotools projects 99% of the time (I've seen one autotools project that checkinstall fails at), and it usually tries to capture install paths and such for other non-standard build methods, but its failure rate is higher there, and it will never work if the program doesn't support "make install" at all (e.g., ZDoom).
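
Typical checkinstall usage on an autotools project, per the footnote above:

    $ ./configure && make
    $ sudo checkinstall   # runs "make install" and builds/registers a package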

