“The Portable Linux Apps project brings the ideal of “1 app, 1 file” to Linux. Applications are able to run on all major distributions irrespective of their packaging systems – everything the application needs to run is packaged up inside of it. There are no folders to extract, dependencies to install or commands to enter: “Just download, make executable, and run!”” A follow-up article describes how it works, and how to transform Debian packages into AppImages. The packages don’t include libraries, so the system won’t need to update the same library in each individual app.
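In practice, using an AppImage is meant to be exactly that simple (the file name here is just a made-up example):

chmod +x ./MyApp.AppImage
./MyApp.AppImage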
I’ve always been a fan of Apple’s self-contained app style, as it’s simply easier to deal with. Unzip to install, drag to the Trash to uninstall.
Would make life for non-Linux types a little easier.
This is a great idea, I really hope some of the major distributions are paying attention.
It is not good for security. The package manager (or the Apple App Store) takes care of maintenance and updates. If this system becomes mainstream, it will be a mess, just like it was back when this was the preferred way of installing software. The stuff in /opt is left to rot until it gets hacked. Having everything in a single file is just as bad.
It will also take a lot more disk space, as on Windows and OS X.
What’s new about this? Klik has had exactly the same thing under Linux for 5 or 6 years – if not longer. A single .cmg file that is an “app image” and contains everything needed to run the app – no install required. Double-click the .cmg file and the app runs.
http://en.wikipedia.org/wiki/Klik_%28packaging_method%29
I guess the usual applies for Linux. We want 10 of everything.
And isn’t http://autopackage.org/ pursuing the same goal, too?
Looks like they replaced the old site with a new one that has less information, though... :/
From their wiki:
Dependency Resolution
Currently, package maintainers are assumed to either:
a. Package all of an application’s dependencies inside its AppImage (excluding those already present in the intended base system)
OR
b. Provide users with instructions to resolve dependency issues.
vs. the OSAlert article:
There are no folders to extract, dependencies to install or commands to enter:
The packages don’t include libraries
Either the packages include all the libraries except for basic ones assumed to exist on every intended target distro, and users really don’t have to install anything – OR the packages don’t include libraries, and users are required to deal with dependencies themselves.
The two can’t both be true at the same time.
Personally I only see merit in this when packaging with libraries.
Yeah, exactly what I wondered. It doesn’t make sense to say there are no dependencies to install but also that the packages don’t come with libraries. It’s got to be one or the other.
The huge thing that the NeXTSTEP/OSX style .app directories give you is that everything is packaged up in a single directory. Note, this DOES NOT PRECLUDE YOU FROM USING SHARED LIBRARIES! All the OSX-style .app directory does, at a minimum, is package the config and icons up into a single directory. This means that you do not have to copy or edit .desktop files, copy icons, create shortcuts, or any of that BS.
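For illustration, a minimal OSX-style bundle looks roughly like this (the app name is just an example):

MyApp.app/
  Contents/
    Info.plist      (metadata: name, icon, which executable to launch)
    MacOS/MyApp     (the actual binary)
    Resources/      (icons and other assets)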
From what I’ve read, the Portable Linux Apps project does something similar. I have not checked in a while, but I think the Linux Standard Base specifies that both the GTK and Qt libraries are installed. This should be sufficient for many if not most apps out there, so there should be little need for static linking.
Hope this takes off, as installing apps (say a newer version than is shipped with the package manager) is my biggest sore spot with Linux.
Decades ago, when working with DOS app directories, I remember thinking, “this is so much like Nextstep/OS-X.”
Gobolinux and other Linux distros already do this.
I have never had any problem with package updates and package managers. In fact, I find the package manager system speedier, safer and more convenient than the standard OS-X and Windows methods (and similar Linux methods).
Don’t forget, the Apple app stores are just knock-offs of Linux package managers, except you have to pay.
Package updates typically work just fine, I agree. However, the sore spot for many is that there’s no dead simple way to install a newer version of a program than the package manager offers. Take OpenOffice, for example: if your distribution comes with 3.0 and you want 3.2, currently there are two things you can do on the package manager end:
1. Search for packages of the new OO version you want. If you’re dealing with a common package format, like deb or rpm, as well as a common enough bit of software like OO, these usually aren’t hard to find. However, dependencies can then become a problem, though thankfully dependency hell isn’t as common as it once was. Still, in OO’s case it requires downloading many packages, figuring out which ones you need, and installing them. Far more complicated than Windows or OS X. This becomes easier if someone is maintaining a repository for the software, of course, but comparatively few common software packages have one, and it’s still highly distribution-dependent.
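For instance, the manual route on a Debian-based system looks something like this (the URL and package names are illustrative, not real):

wget http://download.example.org/openoffice/3.2/ooobasis3.2-*.deb
sudo dpkg -i *.deb

And then you get to find out which dependencies dpkg complains about.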
2. Upgrade your entire OS to get the new software you want. In all honesty, this isn’t practical, nor should it be; it’s a ridiculous way of operating. Even Apple got this right with the iPhone, in that app updates are separated from the OS itself. A separation such as this in a Linux system would be invaluable, as then it would be easy to do application software updates separately from system updates.
Of course there are rolling release distributions such as Archlinux (which is what I use) but these generally aren’t suited to most computer users. They require somewhat regular updating, and there’s always a chance something will break if it hasn’t been tested as well as it should.
Considering no one in the Linux community will ever agree on a standard package format, self-contained apps might be the best way to deal with this. If everyone used deb, or rpm, or insert-package-name-here and followed a set list of system dependencies, then installing new software would be easy. As it stands now, though, self-contained bundles or statically linked binaries seem to be the only way to distribute something that will work for sure on 99% of the Linux install base.
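Static linking, for what it’s worth, is a one-flag affair at build time (a generic sketch, not any particular project’s build line):

gcc -static -o myapp main.c

The resulting binary bakes its libraries in, at the cost of size and of needing a rebuild whenever one of those libraries is patched.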
Then you want a rolling-release distro, like Archlinux. There is no “upgrading,” there is just “updating” and it’s as painless as anything can possibly get:
sudo pacman -Syu
Of course there are rolling release distributions such as Archlinux (which is what I use) but these generally aren’t suited to most computer users. They require somewhat regular updating, and there’s always a chance something will break if it hasn’t been tested as well as it should.
Huh?
On Sidux, all one needs to do is:
# apt-get update
# apt-get install <package>
The first command updates the database for the repository, and the second command installs the most recent version of the package.
No need to upgrade the entire OS.
This same process can be duplicated in Sidux with a number of GUI front-ends.
Other distros that keep the latest versions of packages (such as Arch, Debian unstable/testing, Gentoo, etc.) have similar commands, and also offer this same functionality with a GUI package manager.
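For instance, roughly (exact flags vary by setup):

# Arch
sudo pacman -Syu <package>
# Gentoo
sudo emerge --sync && sudo emerge <package>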
On Sidux, all one needs to do is:
# apt-get update
# apt-get install <package>
The first command updates the database for the repository, and the second command installs the most recent version of the package.
As the parent you replied to rather clearly said: it’s difficult to install versions newer than what’s available in the repos! As such your apt-get won’t help you a bit.
Sidux is Debian Sid — latest versions. Plus one can use many other cutting-edge Debian repos with Sidux. The other distros that I mentioned also put the latest versions in the repos. There are other distros that do likewise.
Furthermore, most of the Linux repo packages are modified and updated way more frequently than their Windows and OS-X counterparts.
Dude. You just don’t want to get it. How about someone who wants to install 3.0 and Sidux has 3.2?
Package managers rock – but you lose flexibility in what you can install, period. My repositories don’t offer the last year of OO.o releases – only the latest. openSUSE and Ubuntu offer these user repositories, which might help, but if that doesn’t, you’re out of luck.
Either way I agree with what was said before: nothing new here, move along – Klik did that years and years ago and nobody cared then either.
For this to work properly you need to have a heavy base of libs installed – all of the Gnome and KDE libs by default at the very least (and all their dependencies, including optional ones). If you do it like that, yes, this works – generally speaking. You could define a stable API and ABI for it through the LSB, only update it every 3 years, and demand backwards compatibility. The Gnome and KDE communities would provide it; everyone else probably wouldn’t, so you’d quickly have to depend on outdated libs – and you’re screwed.
IOW it simply doesn’t work unless all libs provide EXCELLENT backwards compatibility and the ability to keep older versions installed next to the new ones.
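Keeping old and new versions side by side is, by the way, exactly what shared library sonames already allow; a typical system carries something like this (library name and versions are illustrative):

$ ls -l /usr/lib/libexample.so*
libexample.so.1 -> libexample.so.1.0.5   (old ABI, kept around for old apps)
libexample.so.2 -> libexample.so.2.3.1   (new ABI, used by new builds)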
Are you the Dell spokesman?
There are about a dozen different ways to do that within Sidux and within most Debian-based distros. Many of these methods use a GUI.
The same capabilities exist with other package managers used in other distros.
Hint: Use the distro that offers what you want.
Again, you are not necessarily “out of luck” if the package version that you want is not in your default repository. There are several easy ways to install the latest version or an earlier version of a package, with many distros.
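On the Debian side, for example, picking a specific version is built into apt (the version string here is just an example):

apt-cache policy openoffice.org
apt-get install openoffice.org=1:3.0.1-1

The first command lists which versions the configured repos offer; the second installs an exact one.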
So did DOS, Windows 3.1, Gobolinux, Zero Install, etc. By the way, none of these systems preclude the use of a package manager to retrieve/install their packages.
Yes. And such a scenario requires no more resources than the “self-contained” package systems.
However, you don’t really need the “optional” dependencies and you only need Gnome/KDE libs if you have applications that need those libs. So, in this sense, one could operate with fewer resources than those required for a system that is designed around completely self-contained packages.
“Dude,” when I used to multi-boot, I would run applications from the other distros (some compiled years before) located on other partitions, with very few problems.
Or you could avoid all this nonsense and just use Slackware, or better yet, LFS, and compile ALL your own dependencies before installing any software. That way you have everything you need and it only takes you a week or so of prep!
** Before I get flamed or modded into nonexistence, that was sarcasm. Go back and look at this thread and see how silly you all sound. **
Sure, and for me most of this ain’t a big issue either. But for grandpa, it is… and that’s who this one-click stuff is for.
Yes, most of this is possible – it’s just too hard. Having one file or one folder with all of the needed stuff in there is simple enough for the common user. Anything above that – too difficult…
Anyway, I think it is a good idea. I would prefer to have the libraries included inside the package simply because then there would never be any dependency issues, ever. Or overwritten libraries, or 40 different libraries with almost the same name in /usr/lib and /usr/local/lib.
This could prove valuable at least to the people stuck with older versions of their distros.
This is much more like Portable Apps than Mac .dmgs. This should also be obvious from the name.
The whole discussion of:
“This is great, now all the dependencies (including libraries) are included in the .app directory!” VS. “This is horrible, now you have 17 versions of the same library and 17 times the bloat!”
seems like a fanatical flamewar along the lines of vi vs. emacs or Gnome vs. KDE, imho.
Here is why.
Construct the “autopackage” or whatever you want to call it this way:
Upon being run the first time, it checks which major Linux branch you run (e.g. Debian-based .deb, Red Hat-based .rpm, etc. – the LSB should help here, as lsb_release -icr outputs the distro name, version, and code name). Yes, I realize that would encourage fewer distros with fewer unique package formats, and no, I don’t think this is all that bad. If you have a distro that doesn’t fit, send the package developer info that says “my distro has library x in location y,” or “my distro has .deb packages but the libraries are in the Fedora default locations,” and voila, if they use your input instead of ignoring you, the next revision of the autopackage WILL work for you too.
If any of its dependencies (libraries or whatever) aren’t present in their default location (for that distro), maybe even pop up a question – “where is this library on your system?” – with a default answer pre-filled from locate <name of file>, or a hint to run that short command yourself and paste the result into the dialog. Once successful, symbolically link to the system-supplied files. Done!
If it can’t be found anywhere, warn that it is so, and offer a choice:
A) Please install this from your package manager (supplying names of debian/redhat packages that contain this as hints), or
B) automatically wget said library and place it in the .app directory, with a brief warning (with a “never show this warning again” checkbox/command-line switch) that using the downloaded/“standalone” dependency may mean that this app is outdated and insecure, even when the rest of your system is properly up to date and secure.
And of course a preference item inside the app (a toggle switch) between 1) “use built-in dependencies” (more likely to run the first time) vs.
2) “use system files for dependencies” (more secure and up to date).
No bloat or duplication for anyone who can resolve dependencies now, AND no “I can’t run this junk, it’s missing some obscure library (or I don’t know why it won’t run!)”. Everyone is happy!
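A rough sketch of what such a first-run resolver could look like (the library name, paths, and URL are all made up for illustration):

#!/bin/sh
# first-run.sh – resolve one dependency per the scheme above
LIB=libexample.so.1
mkdir -p ./libs
# is the library already known to the system linker?
FOUND=$(ldconfig -p | grep "$LIB" | awk '{print $NF}' | head -n1)
if [ -n "$FOUND" ]; then
    ln -sf "$FOUND" ./libs/"$LIB"    # symlink to the system copy
else
    echo "$LIB not found; fetching a bundled copy (may become outdated!)"
    wget -O ./libs/"$LIB" http://packages.example.org/"$LIB"
fi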
Yes, the initial packaging is longer and more complex (but the complexity is so structured I’d be surprised if it can’t be scripted away) – but only the very first time! All updates, whether from version 3.0.1 to 3.0.2 or 1.0 to 5.6, only need the changed files, so now delta packages start to become easier and more common, so now updating Open Office isn’t a 200 MB download, it’s 25 MB! So much more sane in places with slow/metered/capped/too-expensive connections! Exactly the kind of places where open source can shine!
Imagine there is a library that is used in a lot of packages, just not so widely that the system can assume it is there. Hence the library gets packed with the apps.
Then a fatal security flaw is unveiled in this library. The user will then have to update each application that uses the library instead of just one library package…
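Concretely, instead of one package update you’d be hunting down every bundled copy (hypothetical library name and paths):

# one fix with a shared library:
sudo apt-get install libexample1
# versus finding every app that bundles its own copy:
find /opt /home -name 'libexample.so*' 2>/dev/null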
Why not throw all rational thought away and just link everything statically instead of dynamically?
Security flaws are rarely in libraries, and the entire updating system could be automated.
There is also a major security flaw with the shared library system in that an application cannot immediately patch itself. We’ve seen this plenty of times with Firefox, where the Windows and OSX versions were patched faster.
I wouldn’t call the current system rational when:
1. Dependency issues can break programs.
2. Dependency issues can require an OS update just to run a newer version of a program.
3. Program files are allowed to be scattered across the system.
4. The entire system requires far more labor via package maintenance compared to systems where programs and libraries are independent.
5. It increases porting costs for ISVs.
You know, it was one thing to defend the status quo back in 2001, but with Linux stuck at 1% ever since, I say it is time to give the reformers a chance. The status quo isn’t working.
You know who the status quo works for? Microsoft. When you work against the reformers you keep Linux exactly where Microsoft wants it to be.
There are two main problems that I see with this type of app installation. First, as mentioned, you get multiple copies of the same library, which becomes a security nightmare – especially if you want to limit users’ actions. If they are allowed to load any library into their personal directories and have it work, there are all kinds of things they could do. Second, app folders seem like a technology designed for single-user systems. How well does this work on multi-user systems?
You also have the problem that major subsystems are still being upgraded. Sound, video, the kernel – they are all changing quickly. Some dependencies go well beyond minor version changes, requiring system updates anyway.