jmcphe n00b
Joined: 28 Mar 2003 Posts: 7
Posted: Sun Jul 06, 2003 4:32 pm
Nicing portage is one thing. I simply don't let gcc go multi-threaded, so it only runs away with one of my CPUs. Sure, it's a degradation, but if you run it after 22:00 or on the weekend, there is no observable end-user impact.
Now if we could only figure out a way to compile the packages on one system, then ship the compiled binaries off to the 100 other systems in prod, we might have a business case to get Gentoo into the enterprise world.
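The single-job, niced approach described above can be sketched in /etc/make.conf; the values here are illustrative assumptions, not recommendations:

```shell
# /etc/make.conf (sketch) -- keep gcc to a single job and nice it heavily
MAKEOPTS="-j1"          # one compile job, so only one CPU gets tied up
PORTAGE_NICENESS=15     # emerge and its children run at low priority
```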
madchaz l33t
Joined: 01 Jul 2003 Posts: 993 Location: Quebec, Canada
Posted: Sun Jul 06, 2003 10:36 pm
jmcphe wrote: |
Now if we could only figure out a way to compile the packages on one system, then ship the compiled binaries off to the 100 other systems in prod, we might have a business case to get Gentoo into the enterprise world. |
That one's not exactly hard to do. It's mostly a question of having the same basic architecture on all your servers. If you do, compile on one, then just have a cron job use your favorite file transfer protocol (FTP, SMB, NFS, SSH) to send the result to the other machines. I'd personally recommend SSH for its security and compression. It would take a bit of scripting, but it can be done.
You could also use a shared sandbox environment. Use a file server and put the portage sandbox on a network share. Then it's just a matter of having portage on the other machines fetch the compiled files from the sandbox and merge them into the local filesystem. This could be done with a script that logs on to the prod machines using, say, SSH, then gets portage to merge the files. This would also let you target which machines get the update.
You could also use chroot to basically put your compilation machine at the root of your production system. Then most of the work gets done on your compilation machine, using a make.conf optimised for your prod machine, and the CPU of the compilation machine does most of the work. All that gets used on the prod machine is a bit of bandwidth. If you put your compilation machine on a different network (say, a private compilation network between just that machine and the prod ones), you don't even tie up prod bandwidth. What little CPU would be used on the prod machines would be insignificant.
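The cron-plus-SSH idea above could look something like this sketch; the host names and package directory are assumptions, and the function only prints the transfer commands so you can inspect them before wiring it into cron:

```shell
#!/bin/sh
# Print one rsync-over-ssh command per production host. Running the
# output (or swapping echo for eval) would push the binary packages
# built on this box out to the others. Hostnames below are made up.
PKGDIR=/usr/portage/packages
HOSTS="prod1 prod2 prod3"

push_commands() {
    for host in $HOSTS; do
        # ssh -C gives the compression mentioned above
        echo "rsync -a -e 'ssh -C' $PKGDIR/ $host:$PKGDIR/"
    done
}

push_commands
```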
Just some thoughts.
Edited to remove some typos and add some more ideas. _________________ Someone asked me once if I suffered from mental illness. I told him I enjoyed every second of it.
www.madchaz.com A small candle of a website. Has my lab specs on it. |
tdb Apprentice
Joined: 19 Sep 2002 Posts: 293 Location: New Orleans, Louisiana, U.S.A. (what's left of it anyway...)
Posted: Mon Jul 28, 2003 5:42 pm
jmcphe wrote: | Now if we could only figure out a way to compile the packages on one system, then ship the compiled binaries off to the 100 other systems in prod, we might have a business case to get Gentoo into the enterprise world. |
Portage does that already. From the manpage:
Code: |
OPTIONS
       --buildpkg (-b)
              Tells emerge to build binary packages for all ebuilds
              processed in addition to actually merging the packages.
              Useful for maintainers or if you administrate multiple
              Gentoo Linux systems (build once, emerge tbz2s everywhere).
              The package will be created in the ${PKGDIR}/All directory.
              An alternative for already-merged packages is to use
              quickpkg which creates a tbz2 from the live filesystem. |
asv Tux's lil' helper
Joined: 25 Jul 2003 Posts: 138 Location: State College, PA United States
Posted: Mon Jul 28, 2003 9:38 pm Post subject: just a thought
I could never see running Gentoo on a production system that was mission critical. Such systems cannot have precious CPU cycles taken away to compile updates. Does Gentoo's stable branch address this issue at all? |
EvilTwinSkippy n00b
Joined: 20 Feb 2003 Posts: 63 Location: Philadelphia, PA
Posted: Wed Jul 30, 2003 6:18 pm Post subject: Gentoo on servers...
Frankly, folks, I think a lot of us are wasting grey-matter cycles worrying about computer cycles.
For my part I have 9 identical servers. One is my sacrificial "compile on me" server. It hosts a universal copy of my portage tree over NFS, and the others leech from it. (The build server also has every package in use on the network installed.) It builds the packages, and it's where I work out all of the nasty details of package X clashing with package Y and forcing a rebuild of package Z. The fruit of this labor is saved on the network, and when emerge runs on my other boxes, they pull down the pre-built binaries. There is still a lot of wailing and gnashing of teeth with any upgrade, but at least I don't have to worry about compile time!
A word of warning: you can get into a world of trouble if you aren't careful! I had a box on which I performed a binary update of python without updating glibc. Suffice it to say, it was time to whip out the live CD and get medieval.
I'm starting to adopt a system of releases rather than piecemeal upgrades. It makes life simpler. _________________ I've found that people will take what you say more seriously if you tell them Ben Franklin said it first. |
bmichaelsen Veteran
Joined: 17 Nov 2002 Posts: 1277 Location: Hamburg, Germany
heresiarch n00b
Joined: 29 Jun 2003 Posts: 5 Location: Needham, MA
Posted: Thu Jul 31, 2003 4:48 am Post subject: build once - emerge everywhere!
I'm currently attending Olin College, and all students are required to have the most current rev of the school laptop. Since some friends and I have started using Gentoo on a few old P3 boxes as student servers and on all of our laptops, we have had lots of success with passing around binary packages. We don't have it automated yet (we hope to when the school year starts), but we currently ask around informally before doing big installs (KDE, X, etc.) to see if anyone else has the packages for our laptop. This saves vast amounts of time, at zero cost to optimization. This is my dream environment for Gentoo: we get all the great fast packages, without the annoying compilation step. Eventually, we want to build a stage2 tarball for our laptops to help new users install Gentoo on campus. Ultimately, it would be really cool if we could just image the drive and copy a working Gentoo setup with wireless all set up, things like i8kutil ready, all our local network drives auto-mounted, etc. That should help get more students on Gentoo. |
madchaz l33t
Joined: 01 Jul 2003 Posts: 993 Location: Quebec, Canada
Posted: Thu Jul 31, 2003 5:34 am
Quote: | I'm currently attending Olin College, and all students are required to have the most current rev of the school laptop. Since some friends and I have started using Gentoo on a few old P3 boxes as student servers and on all of our laptops, we have had lots of success with passing around binary packages. We don't have it automated yet (we hope to when the school year starts), but we currently ask around informally before doing big installs (KDE, X, etc.) to see if anyone else has the packages for our laptop. This saves vast amounts of time, at zero cost to optimization. This is my dream environment for Gentoo: we get all the great fast packages, without the annoying compilation step. Eventually, we want to build a stage2 tarball for our laptops to help new users install Gentoo on campus. Ultimately, it would be really cool if we could just image the drive and copy a working Gentoo setup with wireless all set up, things like i8kutil ready, all our local network drives auto-mounted, etc. That should help get more students on Gentoo. |
Another thing you could all do is use distcc, once you're at college I mean. Just have all your computers compile for everyone; that way you get extremely fast compiles because all the computers work on each build. It makes for kick-ass compile times and would be a great learning experience. Just imagine being able to compile on 20 machines at once. _________________ Someone asked me once if I suffered from mental illness. I told him I enjoyed every second of it.
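A rough sketch of what that could look like on each participating box; the peer names and job count are made-up assumptions:

```shell
# /etc/make.conf (sketch)
FEATURES="distcc"   # let portage hand compile jobs to distccd peers
MAKEOPTS="-j8"      # rough rule of thumb: scale jobs with total CPUs

# /etc/distcc/hosts would then list the peers, e.g.:
#   localhost lab-p3-1 lab-p3-2
```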
www.madchaz.com A small candle of a website. Has my lab specs on it. |
heresiarch n00b
Joined: 29 Jun 2003 Posts: 5 Location: Needham, MA
Posted: Thu Jul 31, 2003 12:28 pm
Yep, we've already got that set up. The problem is that it's hard to use the laptops themselves as distcc hosts: they are constantly turning on and off and are almost always on wireless. So for now we have the P3 boxes acting as distcc hosts, which has worked well. We've had to tweak the balance to keep jobs flowing to all machines, but it's still working well. |
wbreeze n00b
Joined: 06 Aug 2003 Posts: 53 Location: Langley, BC
Posted: Wed Aug 06, 2003 11:36 pm
tdb wrote: | jmcphe wrote: | Now if we could only figure out a way to compile the packages on one system, then ship the compiled binaries off to the 100 other systems in prod, we might have a business case to get Gentoo into the enterprise world. |
Portage does that already. From the manpage:
Code: |
OPTIONS
       --buildpkg (-b)
              Tells emerge to build binary packages for all ebuilds
              processed in addition to actually merging the packages.
              Useful for maintainers or if you administrate multiple
              Gentoo Linux systems (build once, emerge tbz2s everywhere).
              The package will be created in the ${PKGDIR}/All directory.
              An alternative for already-merged packages is to use
              quickpkg which creates a tbz2 from the live filesystem. |
When you use one system to compile a package for a whole lot of servers, what do you have to do with regard to emerge sync? Do you have to make sure they are all synced to the same point, or does it not matter? Is there some way to bundle up a tgz of all the data that came from emerge sync, or is that not needed? |
Hmzaniac n00b
Joined: 18 Jun 2003 Posts: 3 Location: In a serverrack
Posted: Wed Aug 06, 2003 11:57 pm
The way I have it:
Export /usr/portage read-only over <insert favourite network file system here> and mount it on your clients. This means you only have to emerge sync on your server, and if you build packages in /usr/portage/packages, they will also be exported and can be used for emerging precompiled packages. _________________ Users are the evil of all roots. |
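With NFS as the <favourite network file system>, that setup would look roughly like this; the server name and network are assumptions:

```
# /etc/exports on the build server -- read-only, as described above
/usr/portage  192.168.0.0/24(ro,no_subtree_check)

# /etc/fstab on each client
buildbox:/usr/portage  /usr/portage  nfs  ro  0 0
```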
Brazil Tux's lil' helper
Joined: 08 Apr 2003 Posts: 117 Location: Los Angeles, California
Posted: Thu Aug 07, 2003 6:08 am Post subject: Would be nice if...
It would be nice if you could limit the amount of CPU used when you are compiling something... I don't know if it could ever be possible, or if there is already a way to do it... Lower priority? A new user account and a lower process priority?
But if CPU is important for your production environment, it would be cool if you could use distcc on all of your servers and only use 10% of the CPU on each box... or just take a day to compile...
Just a thought... What is keeping this from being possible?
And as far as I can tell, why a BSD is better than a Linux distro for a server, or just in general for stability/reliability, is that the people who develop the BSDs are control freaks... Both Linux and the BSDs use virtually the same common libraries... GNU is the OS!! (will cause flames) What would you get if Steve Jobs knew how to program? BSD! (more flames) It's nice to have a body of people making sure of what goes into a system... and I guess there is not as much coordination in development in the Linux community... but I can be just as much of a control freak and do it all myself to make a Linux system just as good... and Gentoo lets me do that. (flames are burning me now)
Anyway... that's what I think... maybe I just said some of that to cause trouble, but I do think it is possible to do things just as well with Gentoo as a server.
But wouldn't it be cool, on an OS, if you could control the CPU usage of a compiler?
-Brazil |
Pythonhead Developer
Joined: 16 Dec 2002 Posts: 1801 Location: Redondo Beach, Republic of Calif.
Posted: Thu Aug 07, 2003 3:08 pm
I have not deployed it on various servers for two reasons, one of which was stated before: compile times. On some servers I don't have the time to take the machine down to compile. I can't have dozens of users affected by compiling, and the nice command only goes so far. In some situations having a binary package that has been heavily tested is great.
The second reason was the requirements of a commercial application vendor that only gives support on Red Hat machines.
That leaves a lot of other appropriate uses though, and I've deployed it happily elsewhere. |
Pythonhead Developer
Joined: 16 Dec 2002 Posts: 1801 Location: Redondo Beach, Republic of Calif.
Posted: Thu Aug 07, 2003 3:09 pm Post subject: Re: Would be nice if...
Brazil wrote: | It would be nice if you could limit the amount of CPU usage when you are compiling something... I don't know if it could ever be possible, or if there is already a way to do it... Lower priority? |
From /etc/make.conf:
Quote: |
# PORTAGE_NICENESS provides a default increment to emerge's niceness level.
# Note: This is an increment. Running emerge in a niced environment will
# reduce it further. Default is unset.
#PORTAGE_NICENESS=3
# |
wbreeze n00b
Joined: 06 Aug 2003 Posts: 53 Location: Langley, BC
Posted: Thu Aug 07, 2003 3:31 pm
Hmzaniac wrote: | The way I have it:
Export /usr/portage read-only over <insert favourite network file system here> and mount it on your clients. This means you only have to emerge sync on your server, and if you build packages in /usr/portage/packages, they will also be exported and can be used for emerging precompiled packages. |
Yeah, I think I've seen that somewhere before, but I guess what I had in mind was something where you don't have to have all the machines networked together, or a way to do it if I had a standalone machine somewhere that I wanted to update from a CD of precompiled binaries. I think you answered my first question, though. I would assume that in order to install an ebuild tgz, all the machines must be synced, or sharing the same copy of /usr/portage?
But back to the other question:
Is there some way to bundle up a tgz of all the data that came from emerge sync?
I guess I could keep a tar of /usr/portage, but that seems a bit silly (it can get pretty big).
What happens if you try to install a brand-new package on a system that isn't up to date with emerge sync? |
sschlueter Guru
Joined: 26 Jul 2002 Posts: 578 Location: Dortmund, Germany
Posted: Sat Aug 09, 2003 9:45 pm
cdunham wrote: | Not a big deal for an occasional security update in the middle of the night, but when the as-of-yet-undetected vulnerability in OpenSSL is discovered, and the only upgrade means a full system rebuild, yikes! |
That's why both the 0.9.6 and 0.9.7 branches are maintained by the upstream developers. |
sschlueter Guru
Joined: 26 Jul 2002 Posts: 578 Location: Dortmund, Germany
Posted: Sat Aug 09, 2003 10:04 pm
For servers, security and reliability are important.
The Gentoo way of dealing with security vulnerabilities is to upgrade to a new version. But new versions of software packages often contain more changes than just the security-related patches. This may have unwanted side effects and may break things somewhere else.
The command "emerge sync" is completely insecure at the moment. There are lots of mirror servers out there, and if only one of them is operated by a malicious administrator, or if the server gets compromised, you may receive a tampered portage tree; if you then emerge an ebuild and run the software that was installed, your machine might be compromised as well. |
markan18 n00b
Joined: 17 Apr 2003 Posts: 11 Location: wonderland
Posted: Sun Aug 10, 2003 1:34 am Post subject: maintaining Gentoo systems using binary packages is a pain
I have done what has been suggested earlier in this thread: set up a compile server where I compile and create binaries for all the packages I need (emerge sync && emerge -ub world), then export the entire /usr/portage tree via NFS. On all the other machines, I try to upgrade by mounting /usr/portage from the compile server and doing an "emerge -uk world". It works for some packages, but often portage does not "see" the binary package, tries to pull the source instead, and fails because /usr/portage is exported read-only. When I try to install software that spans a large number of packages (like KDE), it's even worse. When KDE is not installed and I want to install it, emerge -p kde and emerge -kp kde do not give the same results on the same machine; I wonder why.
Installing KDE from source works great, but the installation from binary packages barely works because some packages appear to be missing. The make.conf file (same USE flags for all) is identical on all my machines, and the compilation options are quite conservative; in fact, I compile for i586 because I still have some old Pentiums around. Finally, I tried the "fixpackages" command, but it has no effect. That's why I can't use Gentoo on more than a few machines: compiling every package on every machine is not an option, so I need to be able to use binary packages. _________________ emerge -u beer |
Daemonfly n00b
Joined: 31 Jan 2003 Posts: 46 Location: Pa - USA
Posted: Mon Aug 11, 2003 6:53 am
In my situation, it's more of a hardware support issue.
An example is my Compaq ML350 dual 600MHz server (a no-longer-supported first model). It has the normal hardware, plus some Compaq upgrades/optional parts. FreeBSD 5.0 installs on it without a hitch: all hardware is found automatically and works perfectly.
None of the Gentoo 1.4 versions, even the release, install on it. Even the newest updated 1.4 live CD doesn't boot completely: the system boots from the live CD, but then errors out and says the CD is not a live CD. It doesn't matter much, as most of the hardware is only supported in Linux by hard-to-find drivers (RPMs) for Red Hat, and I had to do a pretty extensive search just to find most of them.
Gentoo goes on any other PC that it will work on. The ones it doesn't work on are ones that, with my luck, have hardware that is not supported under Linux at all. I don't feel like going out and buying extra network/sound/video cards just to have Linux run on those PCs.
As for general problems: yeah, compile times and the emerge system can be an issue on production servers. But you don't have to compile or emerge your software; it's just one option. |
meta n00b
Joined: 22 Aug 2003 Posts: 45 Location: Cambridge MA
Posted: Fri Aug 22, 2003 4:03 am Post subject: Hardware support
Yeah, what he said.
I spent three days off-and-on trying to get Gentoo with EVMS to run stably on a quad SMP system. I tried EVMS 1, EVMS 2, multiple kernel versions, I even tried to build the RedHat kernel... It seemed to work, but when put under load it would lock up at random, not even giving me any kernel panic or error log output.
Then I installed RedHat 7.3, and that worked first time. |
nickrout Apprentice
Joined: 06 Oct 2002 Posts: 208 Location: New Zealand
Posted: Thu Sep 11, 2003 12:37 am
All this crap about not having gcc on a server is exactly that: crap.
BSD is widely considered one of the most stable and secure server OSes. By default on BSD machines you use the ports system to update software, i.e. you compile it from source. Like Gentoo, they also have compiled binary packages, but as in the Gentoo world, real men don't install binaries.
Therefore I'd wager that a good number of those stable and secure BSD machines are running a compiler, and it doesn't seem to worry them. |
chiringuito n00b
Joined: 18 Aug 2003 Posts: 1
Posted: Thu Sep 11, 2003 1:04 am
It all comes down to what you need to run, and how you run it. I can crash BSD servers on the hour if I misconfigure/abuse them.
If the server is mission critical, test - test - test.
Hey people run 'mission critical' Windows servers, can Gentoo be worse?
Gentoo servers can be just as stable and fast, and if configured/administered properly - secure... |
fimblo Guru
Joined: 19 Feb 2003 Posts: 306 Location: European Union, Stockholm
Posted: Sat Sep 13, 2003 4:04 pm
On a completely different note: I saw your signature, Axl, and just want to point out that:
axl wrote: | to paranode:
afaik the maximum number of open TCP connections ever achieved was on an AIX box (around 40,000 if I remember right). 1,500,000 is not physically possible. there are only 65535 ports on a single computer. even if all the ports are multithreading, like apache for instance, more than 1000 parallel connections on a Pentium 1 would be possible even with gentoo... so 1,500,000 is a lot of bull...t |
Just for your info: the number of TCP ports (65535) and the maximum number of active TCP connections are not the same thing. A TCP connection is uniquely identified by src IP, src port, dest IP and dest port.
So in theory you can have a TCP connection anchored on your side (e.g. w.x.y.z:5555) to each active TCP/IP system on the net simultaneously.
Of course, this is just theory; in reality a regular system (end or intermediate) won't be able to handle millions and millions of TCP connections. But a well-built system on good hardware with lots of memory should be able to take a few hundred thousand low-throughput TCP connections....
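To put rough numbers on that: the arithmetic below is purely illustrative (the client count is made up), but it shows why 65535 ports do not cap the connection count:

```shell
# A TCP connection is the 4-tuple (src ip, src port, dst ip, dst port),
# so one listening socket can carry one connection per remote (ip, port).
ephemeral_ports=64512   # ports 1024-65535 usable on each client host
clients=100             # hypothetical number of client machines
max_conns=$((ephemeral_ports * clients))
echo "$max_conns distinguishable connections to a single listener"
```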
/fimblo _________________ http://blahonga.yanson.org - little geekblog
http://blahona.yanson.org/howtos/livecd - yet another livecd howto |
cdunham Apprentice
Joined: 06 Jun 2003 Posts: 211 Location: Rhode Island
Posted: Mon Sep 22, 2003 8:22 pm
cdunham wrote: | By the way, a plug for the folks at The Planet. Probably could have gotten a better deal elsewhere, but they seem to have it together over there, and they have some fat bandwidth (and Gentoo!). |
I retract this: The Planet no longer supports Gentoo. _________________ This post more meaningful in a scalar context. |
Crg Guru
Joined: 29 May 2002 Posts: 345 Location: London
Posted: Mon Sep 22, 2003 8:53 pm Post subject: Re: Why, exactly, is Gentoo not "server" friendly?
kres wrote: | Call me old-fashioned... but when people say server, I think of a stripped-down, built-for-purpose OS.
|
And how exactly is having a complete compiler system, includes, etc. a stripped-down OS? |