Old 13-08-2003, 19:51   #135
BenH
Inactive
 
Join Date: Jul 2003
Location: South Manchester
Posts: 74
Quote:
Originally posted by DeadKenny
I get extremely concerned about the number of kernel updates with Linux (many security related, especially the ICMP flaw). This is the core of the operating system and should be solid and stable with no need to update on a regular basis. What's so cool about having a "new" kernel all the time? I update a lot of stuff on RedHat without worrying too much, but the kernel updates I investigate thoroughly just to see what's been changed.
The kernel is under constant development, 24 hours a day, so the development cycle is far faster than a commercial program's; hence there can be two kernels released in a single week. However, you do not have to install them, or even patch. One of our PostgreSQL servers is still running on 2.4.6/SuSE 7.3 without any stability problems and has been running non-stop since it was turned on 18 months ago.
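The point is that which kernel you run is entirely your own call. A minimal sketch (Python used purely for illustration) of checking what a box is actually running before worrying about whether an upstream release even applies to you:

```python
import platform

# Report the kernel release this machine is actually running, so you can
# decide for yourself whether a given upstream release is worth installing.
release = platform.release()   # e.g. "2.4.6" on that SuSE 7.3 box
version = platform.version()   # build date / extra version string

print(f"Running kernel: {release}")
print(f"Build info:     {version}")
```

Compare the reported release against the changelog of the new kernel; if none of the changed subsystems are ones you use, there is no pressure to upgrade.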


Quote:
That's what I like about the NT line of Windows. It's still the good old solid NT kernel underneath that I can trust, and each version builds on its core stability. The bugs are all with the add-ons. Sure, they are considered "part" of the OS because Microsoft wrote them all (or at least bought the companies that did). It's no different with Linux apart from who "owns" what. It's still a core kernel and OS and then other apps on top.
Solid, stable, trust and NT do not belong in the same sentence. NT is essentially a fancy microkernel design, similar in spirit to the Hurd; Linux is monolithic. Monolithic kernels avoid the overhead and failure modes of constant message passing between separate kernel-mode processes, because everything runs in a single address space. Sure, they've come a long way from NT4 to NT 5.1, but the uptimes don't even begin to compare.
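To make the intercommunication point concrete, here is a toy Python sketch (a user-space analogy, not kernel code): the same "service" invoked first as a plain in-address-space function call, then as a message round-trip to a separate process. The gap between the two timings is the kind of cost a monolithic design avoids:

```python
import time
from multiprocessing import Pipe, Process


def echo(x):
    """Monolithic analogy: a plain function call in the same address space."""
    return x


def server(conn):
    """Microkernel analogy: a separate process that echoes messages back."""
    while True:
        msg = conn.recv()
        if msg is None:
            break
        conn.send(msg)


if __name__ == "__main__":
    N = 1000

    # Direct calls: no context switch, no serialisation.
    t0 = time.perf_counter()
    for i in range(N):
        echo(i)
    direct = time.perf_counter() - t0

    # Message passing: every request is a send/recv round-trip
    # to another process over a pipe.
    parent, child = Pipe()
    p = Process(target=server, args=(child,))
    p.start()
    t0 = time.perf_counter()
    for i in range(N):
        parent.send(i)
        parent.recv()
    ipc = time.perf_counter() - t0
    parent.send(None)
    p.join()

    print(f"direct calls: {direct:.6f}s, message passing: {ipc:.6f}s")
```

The exact numbers depend on the machine, but the message-passing loop is reliably orders of magnitude slower than the direct calls.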

Also, you've failed to say why MS's marketing department (which, let's face it, is the real success of the company) had NT 5 renamed to 2000...


Quote:
As a developer in a commercial environment, I hate open source. It really slows down the development process and you end up fixing everyone else's bugs just to get things working, which ultimately costs the company more in man-hours. I've experienced this a lot and I'd much rather the company pays for a commercial product, thoroughly tested by professionals, with certification and decent QA (rather than testing by 1000s of 12 year olds who don't have huge salaries and a job at stake as their incentive to ensure quality).
And here we come to the rub. Let me guess: you're a .NET developer? The same .NET that Gartner pointed out was a huge security nightmare.

Well, I'm also a developer, mainly for 8- and 16-bit microprocessors using C and ASM for R&D companies, and I can categorically state that open source software is by far superior to its closed source equivalents. GCC and GDB are frikkin' godsends (and this is from an atheist). OOo outperforms Office without breaking a sweat. MySQL and PostgreSQL walk all over SQL Server because they actually follow the ANSI standards; likewise with Mozilla and likely Chandler. A couple of months back I saved an art department £30K by showing them the GIMP for 15 minutes rather than Photoshop. Heck, you can now even get groupware for free thanks to SKYRiX, from http://opengroupware.org . Apache runs some 60+% of the world's web servers, compared to IIS's 30%. The list goes on and on.

As for your claims about testing, I guess you've never heard of the OSDL? Or the way IBM, Oracle, Novell, Sun et al. are fully behind Linux and do a lot of the testing in conjunction with the major distros? In fact, the only major software company that isn't backing Linux is your paymaster. They're too busy being afraid of it and using others to spread FUD.

The only 12 year olds writing wild code are the script kiddies making life unpleasant for your paymaster's customers/victims.

Regards,

Ben