After two and a half years of work, autopackage 1.0 has finally escaped into the wild. It has a fundamentally new design, and offers an alternative system of software distribution and management on Linux. This article will talk about what this means for the Linux community, and what new directions and possibilities it opens up. It’ll talk about problems remaining to be solved, and finally it will propose solutions for them. If you just want to see what autopackage is like, check out the screenshots or the Flash demo, available from the website.
What is autopackage?
At heart, it’s about allowing developers
to provide binary packages that can be used by every Linux user,
regardless of what distribution or desktop they run. While not
perfect, its success rate is already high and will become higher
in future. Though young, there are already autopackages of Gaim,
Inkscape and AbiWord. It is also being used by much smaller
projects such as the
GtkQt engine or Lincity which otherwise
would have no packages for most distributions, rendering them
difficult and awkward to install for many users.
It has several interesting features apart from working on any
distribution: it understands dependencies and
can resolve them. It supports multiple frontends,
with both textual and GUI frontends coming out of the box. It
ships with a manager application that lets you uninstall
3rd party software installed using autopackage (and in future,
this will develop into a generic tool that works for all
packages). Most importantly, it’s been designed with usability in
mind from the ground up.
“What idiots, a new package format is the last thing we need!”
To do things like support dependency resolution without depending
on particular distributions or package managers, a new dependency
model had to be devised and implemented. To provide binaries that
would run reliably on a rainbow of machines, new tools such
as apbuild and relaytool were written. To provide
the flexibility needed to deal with a wide range of systems, a
completely script/API based approach was used. To provide an
aesthetically pleasing experience GTK+ and Qt frontends were
developed. And finally, to make it simple even for non-technical
users, the ability to bootstrap the whole thing just by running
the packages themselves was added. These requirements could not
have been met by adapting an existing format.
There was an additional, psychological reason. By providing a new
format, users who have been failed by the existing system have a
concrete feature request to make of the developers – rather than
being limited to vague expressions of dissatisfaction, users can
ask developers for something specific to help them. As developers
learn how to build autopackages, we can show them how to make
their software easier to install by evaluating dependencies for
stability, penetration (how many systems it’s installed on) and so
on. We can also teach them how to use programs like relaytool to relax
dependencies. They can then begin to improve their software to be
easier to install, for ease of installation – like usability – is
not something that can be slapped on in five minutes. It must be
considered while the software is built.
What’s wrong with depsolvers like apt?
Now apt and friends are fine tools for sure, but they do not solve
the developers' problem of how to get the latest version of their
program to their users now, and not in six months' time when
it finally gets packaged and integrated into the repositories of
all the distributions out there. The apt model of centralised
repositories is the polar opposite of the decentralised model of
software distribution as used by Windows, MacOS X and BeOS. What
makes sense for the packages comprising the operating system
itself doesn’t make much sense for third party applications. Even
if it were possible to provide an apt repository in which
everything was packaged and everything was up to date, the
usability problems would be crippling. Synaptic offers
unmanageable lists of choice and solutions such as
gnome-app-install simply lessen the problem but do not eliminate
it. Even search-oriented rather than list-oriented interfaces have
problems: no matter how smart your searching is, you’ll never beat
Google. Other problems, which you can read about in the FAQ, slam
the nails into the coffin of this model.
Moving to a decentralised model for distributing applications
raises fundamental questions about the structure and design of
Linux. It’s no longer enough to guarantee source code portability
– instead, binary portability must be provided too. It’s no longer
enough to check for optional libraries in a configure script and
compile their support out when they are missing; instead, the
libraries must be linked at runtime and a fallback path used
if a library is missing. It's no longer acceptable to guess
what is on your users' systems: instead, a large and solid platform
is required on which developers can build.
During the development of autopackage, we have considered and
addressed many of these problems. Binary portability problems have
been tackled with an easy to use GCC wrapper, called apbuild. The
POSIX dlopen interface is awkward to work with, so relaytool was
written to make weak linkage a snap. It makes it easy to fall
back to the old GTK+ file chooser if the new one is not available,
or disable spell checking if gtkspell isn’t installed. Finally
binreloc is a toolkit for making programs installable to any
prefix at runtime: this allows users without root to install
software to their home directory, and administrators can install
programs to network mounts and additional hard disks without
needing union mounts or LVM. It’s a rule that autopackages must be
relocatable.
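The binreloc idea can be illustrated with a small sketch: instead of baking a prefix into the program at compile time, work it out from the program's own location when it runs. This is a toy shell version assuming the conventional $prefix/bin layout; the real binreloc is a C library, and the helper name here is made up.

```shell
#!/bin/sh
# Sketch of the binreloc idea: derive the installation prefix from
# the program's own location at runtime instead of hard-coding it at
# compile time, so a package can be installed to any prefix
# (e.g. $HOME). Toy version; the real binreloc is a C library.

compute_prefix() {
    # $1 is the path of the installed binary, e.g. $HOME/apps/bin/myapp;
    # with the conventional $prefix/bin layout, the prefix is one level up.
    bindir=$(cd "$(dirname "$1")" && pwd)
    dirname "$bindir"
}

# Simulate an install into a user-writable prefix:
mkdir -p /tmp/ap-demo/bin
touch /tmp/ap-demo/bin/myapp

prefix=$(compute_prefix /tmp/ap-demo/bin/myapp)
echo "prefix:   $prefix"              # prints "prefix:   /tmp/ap-demo"
echo "data dir: $prefix/share/myapp"  # where the app would look for data
```

The same program can then be installed to /usr, /usr/local or a home directory without being rebuilt, which is exactly why relocatability is a requirement for autopackages.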
Looking towards the future
There are lots of ideas and plans for what happens now. Probably
the most important is native package manager integration. Right
now, autopackage integrates well with a variety of desktop
environments. However, it doesn't integrate much with the
underlying package management technology. In future, it will do
so.
That means registering with the RPM database when a package is
installed so you can list its files and metadata, so it can
fulfil dependencies for other RPMs and so it’s possible to
smoothly upgrade a package from RPM to an autopackage or
vice-versa. It means using apt-get, yum or emerge to resolve
dependencies as well as autopackage's own built-in dep
resolution. Finally, it means a way to put autopackages inside apt
repositories to aid deployment on managed networks.
Dependency hell is a difficult problem to solve, as you may have
guessed by the fact that it still plagues Linux as late as
2005. Nothing in the design of autopackage stops a developer
depending on extremely new, rare or unstable libraries – the
primary cause. While autopackages check the system directly for
what they need (avoiding the problem of inconsistent metadata
and package managers that don't recognise source installs), if you
actually don’t have the necessary code then the install will
fail. To solve this, it is essential to provide developers with a
broad base of functionality that can be depended upon with only
one expressed dependency: a base set
or platform.
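The "check the system directly" behaviour described above can be sketched in a few lines of shell. This is a hypothetical helper for illustration only, not autopackage's actual dependency API; the library name and paths are just examples.

```shell
#!/bin/sh
# Sketch of filesystem-level dependency probing: look for the library
# file itself rather than consulting an rpm/dpkg database, so that
# libraries compiled from source are recognised too.

have_lib() {
    lib=$1; shift
    for dir in "$@"; do
        if [ -e "$dir/$lib" ]; then
            echo "$dir/$lib"
            return 0
        fi
    done
    return 1
}

# Simulate a system where GTK+ was installed from source into /usr/local:
mkdir -p /tmp/dep-demo/usr/local/lib
touch /tmp/dep-demo/usr/local/lib/libgtk-x11-2.0.so.0

if have_lib libgtk-x11-2.0.so.0 \
        /tmp/dep-demo/usr/lib /tmp/dep-demo/usr/local/lib >/dev/null
then
    echo "dependency satisfied"      # prints this: the file was found
else
    echo "dependency missing: resolve it or abort the install"
fi
```

The point of the platform idea is that most of these individual probes disappear: one check for "platform version X" would replace dozens of per-library checks.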
The user interface provided by both autopackage and traditional
package managers isn’t best of breed. What should be seamless and
transparent is not: the user is still expected to understand the
distinction between a program and a package, and has to comprehend
the purpose of installation. This means users need to think about
non-obvious questions: “Do I keep the package I downloaded around?
Do I still need it after installing it? Can I send it to other
people?”. Drag and drop is not supported. Support for upgrades and
uninstallation is crude and not integrated with the desktop. Many
of these issues affect other platforms like Windows and MacOS X
too, but we should aim high. Solving this means integrating
package management with the desktop so applications become
first-class objects that the user can manipulate just like
documents. The concept of “installation” should fade away and
eventually disappear thanks to seamless desktop integration not
only with autopackage but with all package managers.
Meeting these goals head on will take time, effort and
dedication. At the end, we should have a better system in every
way: more powerful for developers, more flexible for
administrators and simpler for end users. We can raise the bar. We should raise the bar. Let’s get started.
Concise and to the point, how all OSAlert articles should be written. Great work!
I really hope the big distros embrace this
I really like the ideas behind autopackage and it sure addresses an area where Linux is clearly lacking.
However, from following the reaction to autopackage by some Debian and Hoary developers, which was extremely negative, to put it mildly, I’m really worried about the chances of this project being adopted by existing distros.
So I’d really like to know if there have been more positive reactions from other distributions, or if maybe there is even a distribution that actively helps in the development of autopackage?
Mike Hearn recently posted about it in a comment (http://www.licquia.org/archives/2005/03/27/autopackage-considered-h…) to Jeff Licquia’s blog. Odd that he didn’t think it worthy of a mention in the article, but anyway, here’s what he said:
“I’m afraid RPM integration ranks as more important than dpkg integration though. So it may be some time before autopackage registers with Debian. Possibly it won’t happen at all unless somebody steps up to write the patch (whereas I’ll do RPM integration myself if need be).”
whether or not a big distro picks up this thing is irrelevant, the users and developers are the ones who can benefit
http://kitenet.net/~joey/blog/entry/autopackage_designed_by_monkeys…
for a bit of reality to the hype.
And http://www.licquia.org/archives/2005/03/27/autopackage-considered-h… too, while you’re at it. Read the comments in the last one too.
Where this really shines is for proprietary programs, say Doom 3, Oracle, Photoshop or some other big-name program ported to Linux where the idea of releasing the sourcecode is totally out.
I release the source for my linux app, but I’d like to have a binary that ordinary people can just install and run. I was looking at autopackage last time I was doing a release, but it wasn’t mature enough at the time.
Loki’s installer is a good example of a similar project (it didn’t seem to support console-only installs.. AutoPackage has multiple frontends).
That’s how I see it, anyway.
With my download of Inkscape I learned of two cool apps: Inkscape, of course, and autopackage. Autopackage literally installed itself and then the desired package. Since then I have always loved autopackage, because it was so easy and I didn’t have to worry about any dependencies etc.
If autopackage becomes capable of simple click-and-run from any website, this will be a huge step forward for any Linux distro. Apt is fine, but when I want the bleeding-edge Inkscape, I would just have to go to their website and drag it onto my desktop. What could be easier?
I never liked the rpm-way, unless you have apt4rpm but then you could also use any Debian-Distro with less trouble. But for real independence, autopackage is the way to go.
I sell Linux-distros almost daily. But every now and then a customer tells me he thinks it is too hard to install things. Autopackage would be a top selling argument. “You want app XYZ? Just go to their website and drag the app to your desktop and you're done.” This should be the goal of any distro that claims to be userfriendly, and it should be the goal of autopackage.
“However, from following the reaction to autopackage by some debian and hoary developers, that were extremely negative, to put it mildly, I’m really worried about the chances of this project getting adopted by existing distros”
Don’t worry. As Mike said: “autopackage is designed to survive even if everybody hates us”
Re: May also want to read…
“for a bit of reality to the hype.”
We read that blog by joey. Everything he mentioned is by design. Autopackage is fundamentally different from RPMs/DEBs. One shouldn’t even attempt to convert autopackages to RPM/DEB, or vice versa. While RPMs/DEBs are big archives with metadata (oversimplified of course), autopackages are archives with metadata and scripts. Because of the script-based nature, this allows packages to be flexible enough to cope with all the differences between distributions. That is what autopackage is designed for: to be able to install on as many distributions as possible.
Too bad joey’s blog doesn’t have a comment function.
I really like the idea of AP. I think it’s long overdue. But my main concern is security. Does this make it easier for malware/spyware/adware to creep onto your system? One of the greatest “selling” points of Linux (or even OSX) is that the platforms are virtually (or completely?) free of that kind of B.S.
Distro adoption is irrelevant. Autopackage is NOT meant for use by distros, it’s only for third party developers.
As for the negative reactions, I’m rather puzzled about them. I mean, come on, FLOSS is all about choice, isn’t it? So the authors of Autopackage are actually widening our choice, not narrowing it. If you think Autopackage is redundant, which it isn’t, then you’re free to not use it. And yet some people complain and bash it! I’ve even seen the comments stating that everything making installation easier for users is inherently bad… Isn’t it dumb?
Joey’s blog entry seems to be mostly about how he can’t make Alien parse .package files, which is correct and by design. They are very different from “traditional” package formats, as that was the easiest way to make them deal with all the random stuff they have to deal with.
If you want the payload of a package without running anything except the stub (which you can read using less), run the .package file with the -x switch. This will extract it to the current directory. If you want to examine the internals of the package including the scripts and metadata, use the -d switch.
You can also do what Joey did and extract the numbers from the stub directly, however this is not recommended for obvious reasons. If you are of the tin-foil hat variety feel free to read the stub code carefully to ensure it’ll do what I say it will, then run it.
Making the extraction automatic without code execution wouldn’t be too hard, it just means moving the offsets into the stable/supported metadata comments at the top. But, as a feature it’s pretty much at the bottom of my todo list as it’s basically useless (unless you’re Joey Hess).
I really couldn’t care less whether Alien can support it or not, to be frank. We have enough to do without dealing with people who try and convert autopackages to some other format and then wonder why it doesn’t work. Integration with dpkg and other problems Alien solves will come with time through direct manipulation of the systems databases (though that may involve construction of a fake package on the fly). However it’d be a part of the install process supervised by the rest of the runtime, not as a thing the user can do at will.
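The general shape of such a self-extracting stub (a readable script, a marker, a payload located by offset and streamed to tar) can be sketched as follows. This is a toy illustration, not the actual autopackage stub, which records byte offsets in metadata comments as Mike describes.

```shell
#!/bin/sh
# Toy self-extracting archive in the spirit of the autopackage stub:
# a plain shell stub, a marker line, and a gzipped tarball appended
# after it. The stub finds the payload by line offset and pipes it
# to tar. (Illustrative only.)

cat > selfextract.sh <<'EOF'
#!/bin/sh
# Stream everything after the __PAYLOAD__ marker into tar,
# extracting into the directory given as $1 (default: current dir).
offset=$(awk '/^__PAYLOAD__$/ { print NR + 1; exit }' "$0")
tail -n +"$offset" "$0" | tar xz -C "${1:-.}"
exit 0
__PAYLOAD__
EOF

# Build a payload and append it to the stub.
mkdir -p pkgroot
echo "hello" > pkgroot/README
tar cz -C pkgroot . >> selfextract.sh
chmod +x selfextract.sh

# "Install" (extract) into a target directory.
mkdir -p out
./selfextract.sh out
cat out/README    # prints "hello"
```

Because the stub is ordinary shell, it can be read with less before running, which is what makes the -x and -d inspection switches possible without any special tooling on the user's side.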
Honestly, those comments were really shallow.
The first one says how bad it is that Autopackage doesn’t store path information for files, and that they provide no guarantee about compatibility.
Well, if you think Autopackage should store path information, it’s because you still don’t know very well what Autopackage is. In other words: not storing path info is a big plus, as it allows you to install as a simple user, instead of as root all the time.
As for compatibility, the guy doesn’t know what he’s talking about. The Autopackage team has compatibility as the No 1 feature.
Now the second link, where the guy talks about how bad it is that Autopackage installs to /usr. First of all: you can specify where you want to install it; you just need to pass an option on the command line. And second: the Autopackage guys have already said they’re planning rpm/dpkg integration, so instead of complaining it would be better if someone actually started doing stuff.
Victor.
“Distro adoption is irrelevant. Autopackage is NOT meant for use by distros, it’s only for third party developers.”
Sure, but distros helping to make it work with their packaging systems sure would be a benefit.
And it surely will not benefit autopackage when devs start discouraging users from using autopackage, or for example refuse to accept bug reports (not about packages installed by autopackage of course, but in general) from people who use autopackage. And judging by the extremely negative reactions from some devs, I’m quite certain things like this will happen if more people start to use autopackage.
I’m a slackware user, and usually compile apps from source then do the makepkg/installpkg thing and install them that way. Would there be any benefit to me compiling application source code, then using autopackage to install it instead of the slackware tools?
Just tried Autopackage with some packages provided at their homepage and in my experience it feels very good from a user POV. I did not install it as root but instead into $HOME/.local but anyway it integrated well into GNOME. I think it is a very useful packaging tool and I will definitely consider it when packaging my own stuff. Keep going (RPM integration is the way to go)!
Well, it doesn’t cause any more security problems than the MacOS X appfolders/pkg system does, look at it that way.
Seriously, I hate spyware just as much as the next guy. The spyware issue is raised so much it’s in the FAQ. Possible solutions are discussed there. It’s also pointed out that apt doesn’t solve spyware/malware problems at all.
So. If you look at how malware gets into Windows systems, there are several entry points:
* Internet Explorer exploits. There have been a variety of these, as well as variants like exploits in the Sun JVM. These are the so-called “drive by downloads”
* Trojans in apps like Kazaa
* Crap that spreads via buffer overflows and such (phatbot et al)
Briefly, the first and the last are just bugs and design problems we have to watch out for.
The second is generally what people are worried about with autopackage. However, this clearly makes no sense. Apt repositories almost universally do *not* contain proprietary or commercial software; in fact, it’s usually illegal for them to do so, as most commercial software forbids repackaging. So you can’t just include all software except “bad” software, because generally distro repositories classify anything that’s non-free as “bad” and exclude it.
But this is not the same definition of bad that end users are using. The end result is that users will learn that anything commercial must be installed outside of their distribution’s official systems, and therefore there is no reason for them to suspect it would be an issue (as not all commercial software includes spyware, far from it).
Does this thing let me put source in it, so people can easily compile from source? Or is it binary only?
much as i love SUSE, it exists in a state of blissful timelessness because i am largely ignorant of how to make changes, and mostly content to remain so.
i hack around with my XP box all the time, tailoring it to the way i want it to be, because it’s easy. i am not willing to learn some command line garbage just for a bit of customisation, but i do like a custom system so what is the solution? run WinXP/64.
for this reason things like AP are a godsend.
There is no point in creating your own autopackages just to install them. But, installing autopackages would be an alternative to compiling from source if they exist.
well…
Basically that’s what Linux needs, and with a tool such as autopackage it will be easier for commercial developers to make apps (games, CAD programs, anything, actually) for Linux. We will all benefit, since developers won’t be able to complain about Linux’s apparent fragmentation: autopackage will allow their apps to work on most distros without having to worry about it the way they do now.
Linux is here to stay, and a viable platform for commercial apps. Aside from the bunch of great free apps we all have access to now, this could help make Linux more accessible, and in the eyes of corporations that’s what counts.
MS has fewer and fewer advantages over the alternatives, and its disadvantages outweigh its advantages. Autopackage should help us greatly if commercial vendors use it.
The only benefit I could even begin to imagine doing all that work for is the ability for your application to be “uninstalled” easily via Autopackage’s control panel.
I don’t like point and click. Windows installation method is utter bullshit compared to Portage. Autopackage should be useful for some situations though and I’m glad it exists for that reason.
This has been one of my gripes about Linux: I like many things, but I hated that I could not get everything to install right, or at all. I kept asking why somebody couldn’t just invent an installer that works on all Linux distros (so that it doesn’t matter which one I choose) and lets me easily install whatever the heck I want, without having to figure out how to build every damn thing I wanted to install.
It’s even been recommended numerous times that you just stick with what your distro gives you, which only infuriated me even more. With autopackage, nobody can bitch anymore that you can’t install something from the terminal or the GUI: it just works.
I seriously hope this gets heavily adopted by most distros; it’s the only thing holding me back from recommending Linux where it could actually help people.
First, Slackware being a minority distro with a fairly small number of apps included, compared to e.g. Debian, AP has the potential to widely increase the number of available packages.
Second, AP is “compatible” with the two dominant ways of installing packages on Slackware: source and pkgs. It’s compatible with source installs because it checks dependencies against the filesystem directly, and not against some package db (if I understand correctly, the db is only used for uninstallation). It’s compatible with Slack pkgs because they don’t do any dep checking. All in all, AP seems a perfect match for Slackware. More so than e.g. Debian or RPM distros, whose current systems aren’t really “compatible” with source installs or AP.
Third, AP has the potential to dramatically reduce the duplicated packaging effort currently going on in the Linux community.
Bring it on!
I’d just like to point out that their Flash movie is obviously a Slackware system (running Dropline), so perhaps the similarities to Slack’s packaging system aren’t purely coincidental.
I love the idea of AP so much, that I contacted Adobe about it. So hopefully they release an autopackage for Acrobat and future Adobe Linux programs. I recommend you do the same if you like AP.
I also recommend that you urge Filezilla’s developer about using AP. He was deciding whether he should or not for the linux version of FileZilla 3.
Live by the Autopackage and die by the Autopackage!
Hopefully developers will pick up on using Autopackage, because now I don’t wanna go back to crazy command-line mojo anymore.
Using AP to do a complete kernel update, say from 2.4.x to 2.6.x… Now that would be cool. Just grab the package, install it and poof! New kernel on reboot.
Sweet.
Question: how does it deal with dependencies? If I want to only install stuff in user space instead of as root (which would keep things more secure on a system level), will I end up with dependency issues?
Will it also contain the whole install in a single folder in the directory it is pointed to, or will it throw stuff all over the file hierarchy, again leading to privileges issues?
If this is as elegant as the OS X installer or installing software in BeOS then I’m all for it, but if it throws crap everywhere à la Windows then I’ll stick with my distro’s package manager.
Um, building a kernel is possibly the thing autopackage is _least_ suited to. Go read up some more.
There aren’t any similarities between AP and slack packages, really. They just happen to complement each other nicely.
If an AP will be contained in a dir or “spread out” is very much up to the individual dev/packager. Not something that AP enforces as far as I understand.
Imagine if the N distros out there adopted one packaging format… imagine the (literally) years wasted (in aggregate) we would get back… distro writers could actually focus on real features instead of just spending hours trying to shoehorn OO.org or KDE into their package format…
Don’t see it happening though. For Red Hat etc. it would be eating too much crow to drop RPM. Debian people will die clutching .debs… etc. This situation will get worse until some of the distro writers move on to other pursuits.
if you like to have one, vote for this issue
http://qa.openoffice.org/issues/show_bug.cgi?id=46333
Tried it, installed Inkscape first (which installed autopackage), and then went to LinCity website and installed that directly off of their site (without saving to disk first and running). Worked like a charm. I’m very impressed and I think it’s a good solution for third-party apps. Much easier than looking around to find if someone made a working DEB for an app I want that isn’t in the Ubuntu universe/multiverse.
I like the dev’s thinking too.
Congratulations.
This new installer looks interesting, but as far as I can tell from the documentation there is no way to get information about what you have installed, e.g. to ask questions like:
What package installed this file?
What packages depend on this file?
What does this package do?
Is there a newer version of this package for download from the developer?
But I might be wrong; I haven’t had time to check it out in real life yet.
I’m also a bit worried that there is no digital signing. Not that digital signing has had much success on Windows, as people just press OK regardless of who signed it. But it is important for automated upgrades, where it could prevent things like DNS spoofing attacks.
One of the things that I feel is great about this is that it encourages cooperation among developers: it is no longer the job of the distributor to make the program work well with other software, but of the developer himself. This will enforce standards, and where there are no standards it will be an incentive for developers to create one, if only a de facto standard.
Finally it would be nice, if Linux could provide low level hooks for things like removing files so that package management systems could be configured to warn the user or even uninstall packages damaged due to such file removal.
Autopackage is patchwork, integrates poorly with the desktop, doesn’t integrate at all with the system, and eventually _WILL_ break your system. Don’t get me wrong, I like the concept, and I really used to like autopackage, but the fact that they released an incomplete version as 1.0 makes me angry.
Either you do it right, or you don’t do it at all; right now autopackage is *nothing* more than any other package manager under the sun.
right now autopackage is *nothing* more than any other package manager under the sun.
Except that I could go to a developer’s website and install their program without having to search the web to find a package made for my distro, or a repository containing it, and without the developer having to make a package for each distro either. What package manager do you know that does that?
Linspire, Xandros, Lycoris… some of the more difficult distros to get packages for. Ever try Linspire or Xandros? Just try finding some apps… good luck.
Ever see a package for FC when you’re running Mandrake or SUSE? Just a pain to even attempt to get the dependencies resolved. But now there’s light at the end of the tunnel: autopackage.
Autopackage is patchwork, integrates poorly with the desktop, doesn’t integrate at all with the system, and eventually _WILL_ break your system.
If you’re so scared, you can install everything as user – that can’t possibly break your system.
And did you see how much time it took to get to this point? Two and a half years. This wasn’t born yesterday, so I think it’s unfair to call it “patchwork”. This was the perfect time to release 1.0.
Victor.
Autopackage is beautiful work and is what Linux needs to succeed on the desktop. I can’t thank Mike Hearn and the rest of the crew enough for the dedication to make it happen and the vision to rise above the artificial walls that the distributions have built.
These walls only benefit the distributors themselves and not the end users who are all clamoring for a common and easier way to install software.
I am looking forward to seeing what you can pull by the time Longhorn ships. If we have autopackage widely adopted and deployed by most developers by then, Linux on the desktop will be kicking some major ass.
Imagine, KDE 4.0 or Gnome 3, OpenOffice 2.2, autopackage and very advanced and evolved 2.6 kernel. Can life get any better?
I remember when Mike Hearn first wrote several years ago that he intended to take on the packaging problem. I thought then that he was very brave to try to slay the dragon that has flamed many knights before him.
He has produced a fine effort, and is to be congratulated. It’s a very PRACTICAL approach, and looks like it might gain some traction.
It’s a bit sad that there is a lack of leadership amongst large Linux supporters such as OSDL and its sponsors when it comes to Linux packaging. I guess they’re afraid of a backlash and being labelled heavy-handed. The LSB is great, but doesn’t go nearly far enough.
The solution has been known for some time. Mike covers it in his FAQ http://autopackage.org/faq.html
“Essentially, software is easy to install on Windows and MacOS not because of some fancy technology they have that we don’t – arguably Linux is the most advanced OS ever seen with respect to package management technology – but rather because by depending on “Windows 2000 or above” developers get a huge chunk of functionality guaranteed to be present, and it’s guaranteed to be stable.
“In contrast, on Linux you cannot depend on anything apart from the kernel and glibc. Beyond that you must explicitly specify everything, which restricts what you can use in your app quite significantly.”
…
“A desktop Linux platform would help, because instead of enumerating the dependencies of an app each time (dependencies that may be missing on the users system), application authors can simply say that their program requires “Desktop Linux v1.0″ – a distribution that provides platform v1.3 is guaranteed to satisfy that. Support for a given platform version implies that a large number of libraries are installed.”
There’s the answer. It would do away with dependency nightmare, and make installation a snap. But it requires co-operation and a willingness to abandon one’s own sacred turf. Like everything else Linux, it should be a voluntary thing – people/distributions adopt it because it works for them. You don’t want it in your distribution? No sweat.
Until that happens, any packaging tool is fighting a losing battle – either horribly complex or niche-restricted.
I just tried this on my Gentoo Box and it worked great, no problems at all.
It updated the menus on gnome and KDE 3.4 without any issue.
It works very similar to the way Installshield or Wise or Inno works on windows, just download the package and install.
Very impressive.
Congrats to Mike Hearn and the Autopackage team. Great job!
To those F-n people putting down this project, why don’t you just start coding instead of talkin theoretical shyt. Perhaps you could come up with something better? ALL TALK NO WALK!
I’m proud to be a linuxer and live these glorious days! Enjoy the NOW because history is being written!
Congratulations Mike Hearn!
“Linspire, Xandros, Lycoris… some of the more difficult
distros to get packages for. Ever try Linspire or Xandros?”
Aren’t Linspire and Xandros based on Debian? They should be able to use the Debian packages just fine AFAIK. Admittedly, I haven’t tried them myself though (kubuntu user), so YMMV.
Lycoris is based on Caldera (SCO). I doubt there are too many developers releasing s/w for that nowadays..
How does it figure out whether a binary can be installed on a system? And if it can't be, what does it do?
I'm asking because it is supposed to allow a “universal” binary install against your system's dynamically linked libraries, and on Linux you compile things with dependencies (despite what some people are saying here, just as you do on Windows and OS X). So I would like to know what their magical solution is (and no, compiling from source isn't enough, because it isn't hard to do that with your system's tools anyway and create a native package).
I would like someone to answer these questions:
1) How does it handle shared libraries?
2) How does it get the distro's configure flags for the package dependencies (so that if a compile is needed it doesn't screw up the system)?
3) How does it manage a distro's default locations?
4) What about patches that are applied by maintainers to customize some behaviours?
5) What about package classes (system, usr, local, …)?
I still think the only way to fix this problem is with default system/package metadata that all distros publish in a standard way. All other ‘solutions' look unsafe to me.
I'm not saying that AP isn't a good tool, only that I don't trust it as a silver bullet, and I still prefer to use the distro tools (rpm, apt, …).
This must be the fiftieth time I've written this in OSAlert comments, but THE FORMAT IS NOT THE ISSUE. It doesn't fucking matter that Fedora uses RPMs and Debian uses .debs. You could switch them over with a month's work and it wouldn't make a difference. The *real issue* is that all distributions have different combinations of libraries, build options, directory structures, configuration conventions and so forth. *That* is why things have to be built on every different distro; it's got nothing on earth to do with the format of the file (otherwise you'd only need the One True .deb and the One True .RPM for every program, and you'd be done).
You said exactly what I was trying to say.
>>>>Aren’t Linspire and Xandros based on Debian? They should >>>>be able to use the Debian packages just fine AFAIK. >>>>Admittedly, I haven’t tried them myself though (kubuntu >>>>user), so YMMV.
>>>>Lycoris is based on Caldera (SCO). I doubt there are too >>>>many developers releasing s/w for that nowadays..
Actually, using Debian repositories on Linspire could very easily break your system. I have tested autopackage on Linspire, and it works flawlessly, so autopackage is a perfect fit for that situation.
As for Lycoris being SCO-based: if no one is developing for it, then it too is a perfect candidate for autopackage.
One size fits all; I see no reason to argue why that shouldn't be.
>>>>>This must be the fiftieth time I’ve written this OSAlert comments, but THE FORMAT IS NOT THE ISSUE. It doesn’t fucking matter that Fedora uses RPMs and Debian uses .debs. You could switch them over with a month’s work and it wouldn’t make a difference. The *real issue* is that all distributions have different combinations of libraries, build options, directory structures, configuration conventions and so forth. *That* is why things have to be built on every different distro, it’s got nothing on earth to do with the format of the file (otherwise you’d only need the One True .Deb and the One True .RPM for every program, and you’d be done).>>>>>>>>>
Which is exactly why standards are needed: not to rule all under one standard, but so that an app such as autopackage can work with a set standard for libs and paths.
Or, at the least, a standard way for each distro to publish its configuration, so that it can be read by a third-party app such as autopackage, which would then know how to install apps.
As for the different libs, I can't say I'm sure how that part is handled in this case, but there is always room for partial recompiling during installation, or statically compiled apps if all else fails.
It might not be the best solution (hell, it might be, I don't know), but it is a solution, and that's always good.
Now, I believe these problems came about because I downloaded the files on an XP system, copied them to a USB disk, and then copied that onto my KDE desktop. So I had to install Inkscape manually, which is a pain in Slackware, since it needs at least two or three dependent libraries, gtkmm and so on. Fortunately, because I compiled and installed those libraries and used checkinstall, they are now part of my Slackware system's packages. I also installed them specifically into /usr so that any other app can use them in the future. I don't know if autopackage would have done so.
Is it possible to have autopackage install your app in your local directory but the app's dependencies in /usr?
Regarding the string “What idiots…” in the title: don't worry too much about that – the article was written by our lead developer and maintainer, Mike Hearn.
(Mike even wanted to remove it, but was convinced not to.)
Does autopackage also have a convenient and clean way to uninstall things? I don’t know how others think about this, but I would be very reluctant to install autopackages if there was no convenient and clean way to uninstall stuff. Is there a way to get an overview of all installed packages?
Sure, some level of standardisation is a good idea. That's not what pooping said, though; he just brought out the same old ‘everything should use .debs / .rpms and the world will be safe again!' idea that gets posted here every time there's an article about packaging. *sigh*
Does autopackage also have a convenient and clean way to uninstall things? (…)
Yes, as of version 1.0 it installs a graphical manager to list and remove installed packages (in GNOME on FC3 it's ‘System Tools' -> ‘Manage 3rd party programs'). It correctly picked up the packages I had installed with the previous version, no problem. It's a clean, nice interface, clearly stating size, time of installation and whether the root password is required to remove a given package. Also, you can always ‘package remove pkgname' from the command line.
All in all, I am usually fine using RPMs, synaptic and yum, but autopackage was one of the slickest, smoothest user experiences I have had in Linux. I can see it as a great way to complement centralized repos, logically separating the management of ‘applications' from the management of the ‘system', which is sorely needed in a truly multi-user OS like Linux.
Yes, you can uninstall software from the commandline or the GUI.
Commandline: package remove (package name here)
GUI: Applications->System Tools->Manage Third Party Software
Re: Yeah, right!
“Autopackage is patchwork, integrates poorly with the desktop, doesn't integrate at all with the system, and eventually _WILL_ break your system.”
Come on, what do you think we’ve been writing for the past 2.5 years? Do you really think we’d release 1.0 if it really fscks your system?
Autopackage tries very hard to integrate properly with your system. It autodetects the correct location to install .desktop files to, it modifies files to make sure that menu items show up, it updates the linker cache to make sure the app will work, it updates environment variables to make sure that the app can launch, it makes sure binary compatibility issues (like the GLIBC_2.3 symbol stuff and bogus dependencies) are resolved, etc. etc. Other package managers don't even try to cover all this.
Imagine if the N distros out there adopted one packaging format…
That wouldn’t make any difference.
The problem is not the different package formats, but the different filesystem layouts and dependency trees, e.g. you can’t safely use RedHat packages on Suse and vice versa, even though they both use RPM.
I would like someone to answer these questions:
1) How does it handle shared libraries?
It checks the system for them directly during the prepare stage, by scanning the linker cache and the LD_LIBRARY_PATH.
Sometimes a library might be present, or might not, depending on the distro and what the user has installed before. Long term the solution is the platform thing I keep banging on about. For now, we have a variety of techniques like relaytool which makes it very easy for developers to dlopen functionality and cleanly disable features if the supporting library isn’t present.
Autopackage supports the concept of “recommended” dependencies, to complement relaytool.
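As a rough illustration of that prepare-stage check, here is a hypothetical sketch; the function name, fallback directories and overall logic are my own assumptions, not autopackage's actual code:

```shell
# Hypothetical sketch of a prepare-stage library check: consult the
# linker cache first, then fall back to scanning LD_LIBRARY_PATH and
# the usual system library directories. Not autopackage's real code.
have_lib() {
    lib="$1"
    # 1) ask the linker cache (ldconfig may live outside a user's PATH)
    for ldc in ldconfig /sbin/ldconfig /usr/sbin/ldconfig; do
        if "$ldc" -p 2>/dev/null | grep -q "$lib"; then
            return 0
        fi
    done
    # 2) LD_LIBRARY_PATH plus common system library directories
    dirs=$(echo "${LD_LIBRARY_PATH:-}" | tr ':' ' ')
    for dir in $dirs /lib /usr/lib /lib64 /usr/lib64 \
               /lib/*-linux-gnu /usr/lib/*-linux-gnu; do
        if [ -e "$dir/$lib" ]; then
            return 0
        fi
    done
    return 1
}

if have_lib libc.so.6; then
    echo "libc.so.6: present"
else
    echo "libc.so.6: missing"
fi
```

A relaytool-style package would run a check like this per optional dependency and simply disable the matching feature when the library is absent, instead of refusing to install.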
2) How does it get the distro's configure flags for the package dependencies (so that if a compile is needed it doesn't screw up the system)?
It obviously doesn't. Software cares about interfaces: if the interfaces it needs are implemented, it'll be OK; if they aren't, it won't. If your distro configures things such that they don't provide the interfaces the upstream version normally would, you have problems.
In reality, we've been distributing multi-distro packages for years, since the start of the project, and this has never been a problem.
3) How does it manage a distro's default locations?
Default locations? It doesn’t, it always installs to /usr or $HOME/.local right now. In future when we have better KDE support we may install to /opt/kde on SuSE as well, because KDE apps tend to be unhappy if they aren’t installed to the KDE prefix.
4) What about patches that are applied by maintainers to customize some behaviours?
It doesn't care, unless the patches change public interfaces. In that case you're screwed; switch to a distro that isn't breaking compatibility with the rest of the open source world.
Again, in practice this has never really been a problem.
5) What about package classes (system, usr, local, …)?
That’s a term I’ve not seen before, so I can’t comment.
First, I'd point again to the link Bact gave: OpenOffice using autopackage (http://qa.openoffice.org/issues/show_bug.cgi?id=46333). Just register and vote, people!
Second:
Eugenia, since you misinterpreted my previous post, I'm asking it here again:
What about ASKING all developers on gnome-files to provide autopackages of their programs?
Of course, developers shouldn't be obliged to use autopackages. They're free to choose whatever they want, but the success of autopackage depends on the number of developers who use it.
Of course, if Eugenia doesn't like this, everyone is free to ask for an autopackage in the comments section of each program. What do you think?
To be honest, I'm a bit disappointed in the attention autopackage got from the ‘big' guys:
– Not a single post from a GNOME developer on Planet GNOME
– Only 2 volunteer hackers; where are, e.g., the full-time Novell hackers? Apparently they're not interested.
Nevertheless, I wish Mike Hearn & his team the best!
I'm liking autopackage. Finally I do not have to compile/install some library that I know is generally useless and not absolutely required by the application that needs it.
I no longer have the GNOME libraries on my system. It'll be interesting to see if this causes any hiccups with autopackaged software that doesn't exclusively stay within the GNOME environment but does use certain GNOME functions for select things.
I really like being able to install applications into my home directory and run them as if they were systemwide. Fantastic.
I've installed some of the example packages as a user, and it's great to be able to do it that way. It opens many opportunities:
It allows the creation of a true Linux download repository on the web! Magazines can also now really include packages for Linux on CDs and DVDs.
It would also create a new point of focus for distros: with apps distributed separately, it would allow them to get back to focusing on the core system and just making sure they don't have problems with most autopackages out there. I've never figured out what the point was of EVERYONE repackaging things forever.
It surely has pitfalls, but I am sure that patches will be looked at by the project, especially as the pitfalls are most likely to be spotted by developers.
Every Linux distro has its own native package management, period. Telling a beginner that he should install non-native packages on his system means telling him how to render his system inconsistent most quickly. This will scare people away from Linux instead of attracting them!
We don't need yet another package manager; we already have an almost perfect one, the RPM Package Manager:
http://www.linuxbase.org/modules.php?name=FAQ&myfaq=yes&id_cat=11&c…
Instead of yet another package manager, we need
– Nicer graphical frontends for RPM that we already have (Has anybody had a look at the StarOffice8 installer? It’s brilliant! It consists of RPMs, extracts itself into /var/tmp and installs the RPMs interactively and has a nice GUI. *This* is what we need.)
– More standards-compliant distros and more standards-compliant packages (e.g. the RealPlayer and AdobeReader RPMs are *not* standards-compliant, they install themselves into /usr/local by default which is reserved for manually installed software, instead they must install themselves into /opt which can at the moment only be achieved by passing an additional parameter, but this should be the default.)
RPM offers all the functionality we need. RPMs are relocatable (if you didn’t know that, read the man page -> RTFM!), RPMs are suitable for both distributors and third-party packagers, RPM is adopted by all important Linux vendors (Redhat + SuSE + maybe Turbolinux) and RPM is flexible and can be used with different frontends (urpmi, apt, yum, up2date, yast, …). The only thing that is missing right now is a more user-friendly wizard-style graphical installer frontend, since most beginners are not familiar with the command line.
While I like the idea of autopackage's relocation, the fact of the matter is that for non-casual use, you really want the repository and auditability features of RPM.
With apt4rpm and urpmi, the issue of dependency resolution is largely solved (autopackage doesn’t fare well here) so long as we’re fairly consistent about naming packages…
Actually, the issue with distro-specific RPMs would be a non-issue if the following two things were addressed: a standard macro set for ALL common install/uninstall tasks, and some agreement on naming conventions.
The macros need only have the same name; what they do could be distribution-specific:
%post
%add-menu-item Development/IDE/Eclipse %{_bindir}/eclipse %{_datadir}/icons/eclipse.png
or
%post
%add-set-env JAVA_HOME %{_sysconfdir}/alternatives/java_sdk
.. you get the idea. Each macro would invoke a script that carried out the function, and the distribution would simply distribute different implementations of the scripts to suit themselves.
In fact, I find myself doing similar things all the time. The JPackage project already provides a script and macro package that does this specifically for packaging up Java packages.
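To make the idea concrete, here is a hypothetical sketch of the backing script a distro might ship for an %add-menu-item macro. The function name, argument order and .desktop layout are illustrative assumptions, not an existing RPM macro:

```shell
# Hypothetical implementation of the proposed %add-menu-item macro.
# Each distro ships its own version; this one just writes a
# freedesktop.org .desktop file. Names and paths are illustrative.
add_menu_item() {
    category="$1"    # e.g. Development/IDE/Eclipse
    exec_path="$2"   # e.g. /usr/bin/eclipse
    icon="$3"        # e.g. /usr/share/icons/eclipse.png

    dir="${XDG_DATA_HOME:-$HOME/.local/share}/applications"
    mkdir -p "$dir"
    name=$(basename "$exec_path")
    cat > "$dir/$name.desktop" <<EOF
[Desktop Entry]
Type=Application
Name=$name
Exec=$exec_path
Icon=$icon
Categories=$category
EOF
    echo "wrote $dir/$name.desktop"
}

# Demo: point XDG_DATA_HOME at a scratch directory so nothing real
# is touched.
XDG_DATA_HOME=$(mktemp -d)
add_menu_item Development/IDE/Eclipse /usr/bin/eclipse \
    /usr/share/icons/eclipse.png
```

The RPM spec file would stay identical across distros; only the script behind the macro would differ.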
Actually, the issue with distro-specific RPMs would be a non-issue if the following two things were addressed: a standard macro set for ALL common install/uninstall tasks, and some agreement on naming conventions.
There was an attempt at the latter several years ago, under the aegis of the LSB. I even wrote one of the draft specs. It went nowhere. Distros weren’t interested. The Gentoo representative was at first supportive then disappeared – political games over the future of portage, it turned out. The Debian representative believed asking for naming conventions (!) was too much, aiming too high.
In fact the most supportive member was Jeff J from Red Hat who always had time for this effort and was behind it all the way. Nonetheless, one distro was not enough.
It’s alright to speculate about optimal solutions in a theoretical world where everyone uses RPM and the same conventions, but that optimal world does not exist. Speculating about it is a waste of time.
“Come on, what do you think we’ve been writing for the past 2.5 years? Do you really think we’d release 1.0 if it really fscks your system?”
It installs stuff into /usr. On any distribution, that *will* eventually break things, like the distribution’s package management. You can’t get around that unless you change back to /usr/local, bug reports and all.
What about optimization? Will developers have to create installers for every architecture and every x86 variant?
Sorry, I don’t agree. Things will only “break” if you try and install the same program from two different packages, or your distro attempts to install new applications for you (which is itself somewhat broken, IMO).
The likelihood of this happening is low. Even if it did, only the autopackage db would be inconsistent, and it's made up of text files and directories, so you can fix it with a few rm -rfs.
Basically, we've never had reports of problems caused by this. I'm sure eventually some distro will decide that even though the user uninstalled the package, they really, really did want Gaim after all, and the user-installed copy will be replaced with the distro's. It's easy to fix when that does happen.
If you don’t like it, why not help implement RPM integration so it’s guaranteed not to?
I have a question about Autopackage.
I have hardware compatibility issues that disallow me internet access. Because of this, the way Autopackage (and any other package management system for that matter) resolves dependencies is useless to me. My question is, can a developer make a “static” .package that has all the dependencies in the same package? If so, will it install those dependencies in the proper location in the filesystem, or will it install the dependencies in the same location as the app? Is there a way to control this action? Will it update old libs with new ones? For people like me, things like this would be a breath of fresh air.
because KDE apps tend to be unhappy if they aren’t installed to the KDE prefix.
As long as you expand KDEDIRS properly, it shouldn't matter.
For example, if you install into $HOME/.local, you could install a script (a file ending in .sh) like this:
export KDEDIRS=$KDEDIRS:$HOME/.local
into
$KDEHOME/env/
if $KDEDIRS doesn't include it already.
(assuming KDE >= 3.3)
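That trick can be sketched as a small installer step. This is hypothetical: the hook file name is my own, and it assumes KDE >= 3.3 sources *.sh scripts from $KDEHOME/env/ at startup, as described above:

```shell
# Sketch of installing the KDEDIRS hook described above. Assumes
# KDE >= 3.3, which runs *.sh scripts from $KDEHOME/env/ at startup.
# The hook file name is illustrative.
KDEHOME="${KDEHOME:-$HOME/.kde}"
mkdir -p "$KDEHOME/env"

# The single-quoted heredoc keeps $HOME and $KDEDIRS unexpanded, so
# they are evaluated when KDE starts, not when we install the hook.
cat > "$KDEHOME/env/local-prefix.sh" <<'EOF'
case ":$KDEDIRS:" in
    *":$HOME/.local:"*) ;;  # already listed, nothing to do
    *) export KDEDIRS="$KDEDIRS:$HOME/.local" ;;
esac
EOF
echo "installed $KDEHOME/env/local-prefix.sh"
```

The case guard makes the hook idempotent, so sourcing it twice doesn't duplicate the $HOME/.local entry in KDEDIRS.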
> I have hardware compatibility issues that disallow me internet access. Because of this, the way Autopackage (and any other package management system for that matter) resolves dependencies is useless to me. My question is, can a developer make a “static” .package that has all the dependencies in the same package? If so, will it install those dependencies in the proper location in the filesystem, or will it install the dependencies in the same location as the app? Is there a way to control this action? Will it update old libs with new ones? For people like me, things like this would be a breath of fresh air.
# # # #
Well you have a couple of questions in there.
Autopackage does support being built as a `sealed’ installer. The dependencies can be located in the package payload for use and installation as needed.
There is no split between installing a package and its libraries. Any sub-package (required library) is installed into the same prefix as the application. So there is only one question: install everything into the system, or into $HOME? This reduces interaction. Should we ask, for each library package, where it should be installed when installing into $HOME? That would require too much knowledge on the user's part. Also, there is no categorization of packages, so installing into different places depending on type is a non-starter.
There is currently no updating service. If a package already exists, it is uninstalled and reinstalled. The use of luau could be further expanded to give a listing of out-of-date software. Luau itself has an X frontend that gives this information. http://luau.sf.net/
Enjoy,
Curtis
This takes care of one of the two problems of installing stuff under Linux for users.
Now we need a system where one can have multiple versions of the same lib installed. On most distros, if you try to install an app that needs a very new version of a lib, it will overwrite the older version. Often this will work out fine, but at times you risk breaking some other app that needs the older version.
Yes, you can either look for a new build of the broken app made to work with the new lib, or you can compile it yourself. But it may require a rewrite of the broken app, as the new lib may have removed or rewritten some of the functions the broken app needs. And if we are talking about a production-critical app, either may take too long.
Yes, it may be a security problem, but if the admin is made aware of it then the problem is squarely in the hands of the admin.
As for autopackage walking all over the /usr folder: that will only happen if one tries to install an autopackage that has the same content as something that's already in place. And what's the likelihood of that? autopackage's dependency checking actually looks for the presence of files, not just entries in a package database, so even with a source-based install done beforehand you should not see any stomping of files.
Also, if one worries that a user-initiated install is about to stomp all over /usr, isn't it maybe time to review the write permissions for that area of the system? Or maybe someone has access to a root password they don't need?
I think you’re overlooking the possible problems, frankly. Just some theoretical scenarios off the top of my head:
RandomDistroA ships gaim.
Three months later, Joe User wants a newer version of gaim. He installs an autopackage, which installs to /usr and nukes the packaged gaim that’s living there. RandomDistroA still *thinks* it’s got its own gaim installed, though.
Joe User upgrades his RandomDistroA. Surprise, it’s now got a new gaim, so it attempts to upgrade the existing package.
If what happens now is pretty for everyone, I’d be rather surprised. Compare to what happens if /usr/local is used – there are now two copies of gaim, the autopackaged one and the distro one, living together in perfect harmony, with no-one’s package database borked.
I’ve seen more than enough problems from personal experience caused by installing non-packaged software to /usr , is the bottom line. You’re not going to convince me it’s either a good or only-borderline-bad idea.
That’s what native package manager integration is for. It’s planned for post-1.0.
I'm going to harp on this too. I think /usr/bin, /etc, /usr/lib and so on should be reserved for distribution software. /usr/autopackage/man, /usr/autopackage/etc, /usr/autopackage/bin and so on would make things safe. Or, if the point is that non-root users can install, then even better: have autopackage default to
~/autopackage/bin, ~/autopackage/lib and so on.
Finally, the /opt directory is for commercial software. If you see this as mainly for commercial use, that might not be a bad choice.
The terms used in the FHS are ‘locally installed’ software in /usr/local and ‘third party’ software in /opt, IIRC. This is a little ambiguous, autopackage could potentially go into either. For practical reasons I incline towards /usr/local (for instance, with the default partitioning on most distros, there’ll be a lot more space available here than in /opt). /usr/autopackage is definitely not right, the FHS expressly states that applications should not have their own directory directly under /usr.
Given the problems not having it could cause in conjunction with the /usr thing, _either_ /usr/local harmonisation _or_ native package manager integration really should have been planned for 1.0, IMHO.
The first part of the native package manager integration is about testing to see if the software was already installed and removing it if possible. If it was installed via a package manager, that means asking the package manager to remove it.
No, this doesn’t currently exist. It’s not hard to implement, and people wanted what we already had to be out and stable. These issues will be resolved with time.
Sorry, but having to compile 50k apps isn't my deal.
autopackage isn't an application in the same sense. It's a way of installing applications that is outside the normal framework. I agree general apps shouldn't live directly under /usr, but autopackage is different, since in theory you could have 50 apps under /usr/autopackage. For example, on the Mac, Fink uses /sw and DarwinPorts uses /opt/local, and that way neither one screws up Apple's operating system when you install additional stuff.
…on my Slackware 10.0 system, and I love it!!! This autopackage thing is GREAT.