tag:blogger.com,1999:blog-83353140754843939062024-03-17T02:17:43.624-05:00LinuxphiliaTechnology and the Linux benefitAnonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.comBlogger19125tag:blogger.com,1999:blog-8335314075484393906.post-84835696065584391852012-06-13T10:48:00.000-05:002013-05-31T11:34:14.868-05:00Setting up a network-based, centralized home directory as a service<p>As I mentioned in my <a href="http://linuxphilia.blogspot.com/2012/06/my-wife-and-i-like-to-read-books-on-go.html">last post</a>, I just built out a dedicated server at home, and I'm migrating a bunch of services to it. One reason I built this server in the first place was to set up roaming home directories for all my Linux PCs in the house.</p>
<p>I tried a few configurations before landing on the one described here. Ubuntu 12.04 would fail to boot outright if a network drive listed in the <code>fstab</code> didn't mount properly. Ubuntu 10.04 would error on boot, but give you the opportunity to skip the mount and continue booting. However, if the share was mounted at <code>/home</code> on the client machine, logging into a desktop environment would fail because none of the config files were available. Even when the mount worked properly, having it mounted at <code>/home</code> caused performance problems. Relying on the network for critical processes like booting and logging into a desktop turned out to be a deal-breaker for automounted network home directories.</p>
<p>But I did settle on a solution that works well. In brief, I servicized the mounting and unmounting of the network drive, and the control script I wrote for it also places <code>netdrive</code> symbolic links in each non-system user's home directory if a directory on the share exists in their name. Here's how it's done.</p>
<p>There are two parts to this configuration - a server and some clients. Let's set up the server.</p>
<p>My server is running Debian Squeeze, but these instructions should translate well to other distributions. First, install the NFS server:</p>
<pre><code>sudo apt-get install nfs-kernel-server
</code></pre>
<p>Then set the config to share the <code>/home</code> directory over the network. Add this to the end of <code>/etc/exports</code>:</p>
<pre><code>/home 192.168.0.0/255.255.255.0(rw,sync,fsid=0,no_subtree_check)
</code></pre>
<p>You should modify that to suit your specific needs. <code>/home</code> is the directory on the server to share. <code>192.168.0.0/255.255.255.0</code> means that only systems on the <code>192.168.0.x</code> subnet are allowed to access it, which, for me, means everything behind my router; the share won't be accessible to anyone on the public side. Finally, <code>(rw,sync,fsid=0,no_subtree_check)</code> is a set of options that say, respectively, "Mount it read/write so users can save files here," "Write changes to disk before replying so data isn't lost if the server goes down," "If the NFS server is NFSv4, treat this directory as the root of all shared directories," and "Speed things up by skipping subtree checks when files inside the share are accessed."</p>
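<p>If you script this setup, it's worth making the append idempotent so a re-run doesn't duplicate the line. A minimal sketch, with a temp file standing in for <code>/etc/exports</code> so it's safe to try (adapt the paths before using it for real):</p>

```shell
# Append the export line only if it isn't already present.
# A temp file stands in for /etc/exports in this sketch.
exports=$(mktemp)
line='/home 192.168.0.0/255.255.255.0(rw,sync,fsid=0,no_subtree_check)'
grep -qxF "$line" "$exports" || echo "$line" >> "$exports"
grep -qxF "$line" "$exports" || echo "$line" >> "$exports"   # second run is a no-op
grep -c '/home' "$exports"   # → 1
```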
<p>Restart the NFS server:</p>
<pre><code>sudo service nfs-kernel-server restart
</code></pre>
<p>If you encounter problems, check <code>/var/log/daemon.log</code> for something to Google with.</p>
<p>I had firewall problems here. Mounting an NFS share on a remote server uses lots of different ports, and I was unable to identify them all. I ended up adding a rule to my firewall to let anything behind my router (that <code>192.168.0.x</code> subnet) access any ports on the server. For me, this isn't a security issue. You should consider your situation and make that decision for yourself. At any rate, here's my command to make that firewall rule:</p>
<pre><code>sudo ufw allow from 192.168.0.0/24 to any
</code></pre>
<p>Now get onto a client machine and try to mount it manually just to be sure it works:</p>
<pre><code>sudo mount -t nfs -o proto=tcp,port=2049 192.168.0.100:/home /mnt
</code></pre>
<p>If you get no errors, try...</p>
<pre><code>ls /mnt
</code></pre>
<p>...and check if what you see is what you expect. If so, unmount it:</p>
<pre><code>sudo umount /mnt
</code></pre>
<p>Assuming that everything works, it's time to set up a control script. The script below is hardcoded to mount the NFS share to <code>/mnt/netdrive</code>. You should probably read through it and understand it before running it to be sure it won't interfere with things on your system. If it's going to work for you, paste it into <code>/etc/init.d/netdrive</code> or wherever else you decide.</p>
<pre><code>safe_start () {
if [ ! -d /mnt/netdrive ]; then
mkdir /mnt/netdrive
fi
if grep -qs '/mnt/netdrive' /proc/mounts; then
echo "Something is already mounted at /mnt/netdrive [ FAIL ]"
exit 1
else
mount -t nfs -o proto=tcp,port=2049 192.168.0.100:/home /mnt/netdrive
fi
for u in `grep '/home' /etc/passwd | cut -d: -f1`; do
if [ -d /mnt/netdrive/$u ]; then
HOME=`grep "$u" /etc/passwd | cut -d: -f6`
if [ -e $HOME/netdrive ]; then
rm $HOME/netdrive
fi
ln -s /mnt/netdrive/$u $HOME/netdrive
fi
done
}
safe_unmount () {
FILES_IN_USE=`lsof | grep '/mnt/netdrive' | awk '{print $9}'`
if [ ! "$FILES_IN_USE" = "" ]; then
echo "The following files are located on the netdrive and are still in use:"
echo $FILES_IN_USE
exit 1
fi
unmount
}
unmount () {
umount /mnt/netdrive
for u in `ls /home`; do
if [ -d /home/$u/netdrive ]; then
rm $u/netdrive;
fi
done
}
show_status () {
if grep -qs '/mnt/netdrive' /proc/mounts; then
echo "Netdrive is mounted."
else
echo "Netdrive is not mounted."
fi
}
show_help () {
echo "This script accepts the following commands: start, stop, forcestop, status, help"
}
case $1 in
"start" )
safe_start ;;
"stop" )
safe_unmount ;;
"forcestop" )
unmount ;;
"status" )
show_status ;;
* )
show_help ;;
esac
exit 0
</code></pre>
<p>Apply permissions:</p>
<pre><code>sudo chmod 755 /etc/init.d/netdrive
</code></pre>
<p>Now you can run...</p>
<pre><code>sudo /etc/init.d/netdrive start
</code></pre>
<p>...to mount it, or give it <code>stop</code> instead to stop it safely. The script will refuse to unmount the drive if it still has files open. If you want to override that check, use <code>forcestop</code>. <code>status</code> will tell you if the network drive is mounted or not. <code>help</code> will give you a basic usage message. When mounting, this script will also place the <code>netdrive</code> symlink in users' home directories, so, for example, <code>/home/ryan/netdrive</code> is mapped to <code>/mnt/netdrive/ryan</code>.</p>
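<p>To see the symlink-placement logic in isolation, here's a self-contained sketch that runs entirely inside a temp directory. The passwd-style file, users, and share directory are all stand-ins, so nothing on your real system is touched:</p>

```shell
# Self-contained sketch of the per-user symlink step. Everything lives
# under a temp directory: a fake passwd file, a fake /home, and a fake
# share. A user gets a link only if their home is under .../home/ AND a
# matching directory exists on the share.
root=$(mktemp -d)
mkdir -p "$root/netdrive/ryan" "$root/home/ryan" "$root/home/svc"
printf 'ryan:x:1000:1000::%s/home/ryan:/bin/bash\n' "$root" > "$root/passwd"
# svc has a home under /home but no share directory, so it gets no link.
printf 'svc:x:999:999::%s/home/svc:/usr/sbin/nologin\n' "$root" >> "$root/passwd"
while IFS=: read -r user _ _ _ _ home _; do
    case "$home" in */home/*) ;; *) continue ;; esac   # skip system users
    if [ -d "$root/netdrive/$user" ]; then
        ln -sf "$root/netdrive/$user" "$home/netdrive"
    fi
done < "$root/passwd"
ls -ld "$root/home/ryan/netdrive" >/dev/null && echo linked   # → linked
```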
<p>Now, I still wanted my drive to mount on boot. So I threw a symlink in the startup:</p>
<pre><code>sudo ln -s /etc/init.d/netdrive /etc/rc2.d/S10netdrive
</code></pre>
<p>That's on my Ubuntu 12.04 machine. I can't remember at what point in the startup sequence I placed it on my Ubuntu 10.04 box, but the exact position is somewhat discretionary anyway.</p>
<p>So reboot your client and make sure you boot with a working <code>netdrive</code> symlink in your home directory. And done!</p>Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com84tag:blogger.com,1999:blog-8335314075484393906.post-19355663397792948742012-06-11T14:45:00.001-05:002012-06-13T08:53:22.728-05:00Setting up a headless Calibre server<p>My wife and I like to read books on the go, and we use an Android app called <a href="http://www.aldiko.com/" target="_blank">Aldiko</a> (it's great) to read epubs. I have loads of epubs on my computer, and I have traditionally used <a href="http://calibre-ebook.com/" target="_blank">Calibre</a>'s server feature to share them with our phones. In the past, I kind of cobbled things together on an Ubuntu box that also served as my main desktop computer. That worked well enough, but I've recently set up a headless Debian Squeeze server, and I decided to migrate my Calibre server there. I hit some setbacks along the way, so I thought it would be worth documenting the process here and how I finally got things working.</p>
<p>My server runs headless, and that's the first problem with Calibre. The version of it in the Debian Squeeze repositories doesn't have a standalone server mode that is configurable without using the graphical UI. I can't have that, so I decided to get newer binaries installed. Ordinarily, I wouldn't recommend this, especially on a system like Debian whose community prides itself on the stability of their software packages, but this is a functional necessity, so I'll take it. This command, ripped straight from the <a href="http://calibre-ebook.com/download_linux" target="_blank">official Calibre website</a>, worked for me:</p>
<pre><code>sudo python -c "import sys; py3 = sys.version_info[0] > 2; u = __import__('urllib.request' if py3 else 'urllib', fromlist=1); exec(u.urlopen('http://status.calibre-ebook.com/linux_installer').read()); main()"
</code></pre>
<p>When prompted, I told it to install to <code>/opt/calibre</code> so it wouldn't conflict with any system libs or binaries.</p>
<p>That means that <code>/opt/calibre/calibre-server</code> is the standalone daemon for serving e-books.</p>
<p>I wanted to servicize it, so I wrote this init script and placed it at <code>/etc/init.d/calibre-server</code>:</p>
<pre><code>#!/bin/bash
CALIBRE_LIBRARY_PATH="/home/shared/Calibre Library"
PIDFILE=/tmp/calibre-server.pid
USER=calibre
PORT=8081
start() {
echo "Starting Calibre server..."
su -c "calibre-server --with-library=\"$CALIBRE_LIBRARY_PATH\" -p $PORT --pidfile=$PIDFILE --daemonize" &
if [ $? -ne 0 ]; then
echo "Could not start calibre-server."
fi
}
stop() {
echo "Stopping Calibre server..."
if [ -e $PIDFILE ]; then
read PID < $PIDFILE
ps aux | grep "$PID" | grep 'calibre-server' > /dev/null
RUNNING=$?
if [ $RUNNING -eq 0 ]; then
kill $PID
if [ $? -eq 0 ]; then
rm $PIDFILE
fi
else
echo "Could not find a calibre-server process with PID $PID."
fi
else
echo "Could not find pidfile: $PIDFILE"
fi
}
restart() {
stop
start
}
status() {
if [ -e $PIDFILE ]; then
read PID < $PIDFILE
echo "calibre-server is running with PID $PID."
else
echo "calibre-server is not running."
fi
}
unknown() {
echo "Unrecognized command: $1"
echo "Try one of the following: (start|stop|restart|status)"
}
case $1 in
start )
start
;;
stop )
stop
;;
restart )
restart
;;
status )
status
;;
* )
unknown
;;
esac
</code></pre>
<p>You can change the variables at the top to run the server differently. Once this is given execute permissions, you can start the server with:</p>
<pre><code>/etc/init.d/calibre-server start
</code></pre>
<p>And stop it with:</p>
<pre><code>/etc/init.d/calibre-server stop
</code></pre>
<p>It looks like the <code>service</code> command works, too:</p>
<pre><code>service calibre-server start
</code></pre>
<p>I know there are some other problems I need to work out (how to import new books, for example), but this seems like a good start. Anyone have any tweaks or additions to note?</p>Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com19tag:blogger.com,1999:blog-8335314075484393906.post-60358509881075104382011-09-21T13:16:00.006-05:002011-09-28T11:21:50.820-05:00Apache segfaults when PHP runs LDAP functions<p>I've been spending some time at work compiling a web stack to update the old, unsupported, and unmaintained Solaris webstack/coolstack software. It consists of the latest stable releases of Apache's httpd, MySQL, PHP, Python, and phpMyAdmin, compiled from source, to include all of their dependencies. One specific requirement here is that the various LDAP modules work. Many of our customers use this to authenticate users against a master directory on the network. I've encountered a problem.</p>
<p>I have a test script that pulls my own user information from our LDAP server. If I run this through the CLI PHP parser, it successfully and correctly gathers my information. If I request that page through Apache, httpd throws a segmentation fault. Even if I set Apache's LogLevel directive to 'debug', I get no information about <i>why</i> it segfaults. The only relevant output is as follows:</p>
<p><tt>[Wed Sep 21 13:23:06 2011] [notice] child pid 6928 exit signal Segmentation fault (11)
[Wed Sep 21 13:23:06 2011] [info] removed PID file /opt/utsawebstack/apache2/logs/httpd.pid (pid=6923)
[Wed Sep 21 13:23:06 2011] [notice] caught SIGTERM, shutting down</tt></p>
<p>The segfault <i>only</i> happens when LDAP functions are called from PHP. ldap_connect(), for example, fails. Other pages render fine through Apache.</p>
<p>Google hasn't been much help. The closest I've come to a solution is a patch for PHP version 4.4.4 — a version of PHP known to be buggy and unusable — and the patch no longer applies.</p>
<p>For the sake of detail, here are the configure strings for the relevant software (taken from a working build script, hence the variable names):</p>
<p><ul><li>Berkeley DB (a dependency of OpenLDAP):<br />
<tt>../dist/configure --prefix=$BUILDBASE/lib</tt></li>
<li>OpenLDAP (client only):<br />
<tt>./configure --prefix=$BUILDBASE/lib/ --disable-slapd --disable-slurpd</tt></li>
<li>httpd:<br />
<tt>./configure --prefix=$BUILDBASE/apache2 --with-ldap --enable-authnz-ldap=shared --enable-ldap=shared --enable-headers=shared --enable-mime-magic=shared --enable-proxy=shared --enable-rewrite=shared --enable-mods-shared --enable-ssl=shared --with-z=$BUILDBASE/lib</tt></li>
<li>PHP:<br />
<tt>./configure --prefix=$BUILDBASE/php --with-apxs2=$BUILDBASE/apache2/bin/apxs --with-zlib=$BUILDBASE/lib --with-curl=$BUILDBASE/lib --with-iconv=$BUILDBASE/lib --enable-calendar --with-mysql=mysqlnd --with-mysqli=mysqlnd --enable-sockets --enable-zip --with-pear --with-mcrypt=$BUILDBASE/lib --enable-mbstring --with-ldap=$BUILDBASE/lib</tt></li></ul></p>
<p>Level of Difficulty: Solaris 10</p>
<p>Other Things I Know:</p>
<p><ul><li>It's probably not a compiler problem. I'm using the latest version of make to do all of the compiling, and it's using GCC 3.4.3. And besides, most of this code requires GCC. Using the Sun/Oracle compiler would likely cause problems and failure.</li>
<li>It's not an architecture thing. I have compiled this and reproduced the problem on i386 and sparc hardware.</li>
<li>Since PHP and Apache are both compiled against the version of OpenLDAP that gets compiled first (see config string above), both are heavily reliant upon the ldap.conf file associated with it. Perhaps this is obvious to most, but I had wrongly assumed for some reason that Apache was the only thing using that. For what it's worth, that file is properly written.</li></ul></p>
<p>Who has any idea what's going on here? Anybody?</p>
<h3>FOUND A SOLUTION</h3>
<p>The problem basically boils down to this:</p>
<p>I wrongly assumed that the <tt>--with-ldap</tt> configure argument would accept a directory to search for the OpenLDAP libs (for example, <tt>--with-ldap=/opt/webstack/lib</tt>). It does not. Instead, I needed to pass <tt>--with-ldap</tt> with no directory attached, along with <tt>--with-ldap-libs=/my/openldap/lib/dir</tt> and <tt>--with-ldap-include=/my/openldap/include/dir</tt>. Problem solved.</p>Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com2tag:blogger.com,1999:blog-8335314075484393906.post-20275617853940474632011-01-23T12:43:00.004-06:002011-01-24T08:17:43.101-06:00Setting up public key authentication for SSH<p>If you're like me, you remote into a handful of servers using SSH all the time. The process is fairly simple:</p>
<ol>
<li>Get to a terminal</li>
<li>ssh username@hostname</li>
<li>Type password</li>
<li>Get to work</li>
</ol>
<p>No, it's not terribly difficult, but when you have to type that password fifty times per day, you begin to realize that it's time-consuming and repetitive. And there happens to be a way to eliminate that step from the process.</p>
<p>The SSH protocol supports authentication by public keys, and setting this up is a trivial matter. The configuration process goes something like this:</p>
<ol>
<li>Generate a key for your client system</li>
<li>Put it on the server</li>
</ol>
<p>One prerequisite: your SSH server must have public key authentication enabled. This is usually the case by default, but if you want to check, you can look in your /etc/ssh/sshd_config file. Try this:</p>
<p class="code">grep 'PubkeyAuth' /etc/ssh/sshd_config</p>
<p>If the output is</p>
<p class="code">PubkeyAuthentication yes</p>
<p>then you're safe to continue. Otherwise, you'll have to make the change in that file and restart the SSH server. There are a few ways to accomplish this, and I won't go into them here because it's not the point.</p>
<p>Once that's confirmed, you can create your RSA key. On the client machine, run</p>
<p class="code">ssh-keygen -t rsa</p>
<p>You'll be asked for a filename. Just press enter to accept the default, which is probably ~/.ssh/id_rsa</p>
<p>You'll also be asked for a passphrase, which is optional. Since the point here is to eliminate typing a password with every SSH connection, it's best to leave the passphrase empty. You'll be asked to confirm it either way.</p>
<p>When done, you'll get a printout of your fingerprint and you'll return to a prompt.</p>
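<p>If you're scripting this, the prompts can be skipped entirely. A hedged sketch — the key lands in a scratch directory here, not ~/.ssh, so adjust the path for real use:</p>

```shell
# Generate an RSA key pair non-interactively: -f names the key file,
# -N '' sets an empty passphrase, -q suppresses the fingerprint printout.
keydir=$(mktemp -d)
ssh-keygen -t rsa -f "$keydir/id_rsa" -N '' -q
ls "$keydir"
```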
<p>Now check the key files' permissions. The public key (id_rsa.pub) is safe for anyone to read, but the private key must be readable by you alone; otherwise anyone on the machine could impersonate you, and SSH will refuse to use a private key with loose permissions anyway.</p>
<p class="code">chmod 600 ~/.ssh/id_rsa</p>
<p><b>EDIT:</b> Commenter FKereki rightly points out a simpler and better way to publish your public key to your server. You can (and should) run:</p>
<p class="code">ssh-copy-id username@hostname</p>
<p>This command makes sure that the pubkey is added to the appropriate file, and it makes sure nothing gets lost. This is the best possible way to accomplish this task if the command is available to you. However, should you be working in a system where this isn't available (such as a Solaris environment), you can use the following steps to make things work.</p>
<p>Copy your public key to the server as a specific filename.</p>
<p class="code">scp ~/.ssh/id_rsa.pub username@server:~/.ssh/authorized_keys</p>
<p>Give the authorized_keys file restrictive permissions as well (chmod 600); if it or the ~/.ssh directory is writable by other users, sshd may refuse to honor it.</p>
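<p>One caution about the scp step: copying straight to ~/.ssh/authorized_keys replaces the whole file, wiping any keys that were already authorized. Appending is the safer pattern. A sketch — the remote form is shown in a comment with the post's placeholder hostname, and the runnable part demonstrates the append locally with made-up keys:</p>

```shell
# Remote form, appending instead of overwriting:
#   cat ~/.ssh/id_rsa.pub | ssh username@server 'cat >> ~/.ssh/authorized_keys'
# Local demonstration that appending preserves the earlier key
# (FIRSTKEY/SECONDKEY are stand-ins, not real key material):
dir=$(mktemp -d)
echo "ssh-rsa FIRSTKEY user@host1" > "$dir/authorized_keys"
echo "ssh-rsa SECONDKEY user@host2" >> "$dir/authorized_keys"
grep -c '^ssh-rsa' "$dir/authorized_keys"   # → 2
```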
<p>The next time you log in from this client machine, you won't be asked for your password.</p>Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com4tag:blogger.com,1999:blog-8335314075484393906.post-74038480840177138202010-08-04T19:18:00.002-05:002010-08-04T19:57:34.650-05:00Techville: Wireless Windows Woes<p>I've never been a with-the-grain type of tech. This, I suspect, is because I tend to be a hacker in the original sense of the word. I want things to be efficient, even if that makes things a little ugly. Being a hacker, however, I usually manage to make things both functional and attractive. And speedy.</p>
<p>Also, I suspect it is because most people who come into IT jobs come straight out of a college education and into a corporate world, and both college and corporate are ruled by Windows. To many of these people, I suspect that a 20 GB operating system with no default usable software is acceptable, and I guess they think a fifteen minute boot time is worth the wait.</p>
<p>Fifteen minutes? Exaggeration?</p>
<p>Hardly. Not in this case.</p>
<p>See, we're all issued netbooks at my place of employment, and for good reason. We cover a lot of physical ground, and are often away from the office fixing computers for hours at a stretch. We need to update our trouble tickets in the meantime, so we bring along our netbooks with solid state drives (to improve mobility and decrease hardware damage from jostling) and update tickets while connected to the ubiquitous wireless network.</p>
<p>There are rules applied to the use of these netbooks, mostly surrounding security, and there's an image that gets blasted onto these machines that contains the OS (Windows XP SP2), a handful of useless software (does anybody still use iTunes 6, and what purpose does that hold at work if it's too old for iPhones?), and SafeBoot (disk encryption with some remote password recovery software). All of this software presumably meets the requirements of the security department, but it all conspired against me because (and I cannot stress this enough) I hate bloat.</p>
<p>Furthermore, wireless didn't work. This is a Dell Inspiron Mini 10 with a Broadcom wireless chipset in it, and I tried everything. I enabled and disabled services. I switched which applications were in control of the hardware. I fiddled with the wireless switch (which is Fn+F2 on this device). Nothing worked. In the end, every single program (including the Dell WLAN management software) informed me that although the hardware was present, recognized, had the proper drivers, and was working, it was detecting no wireless networks in range.</p>
<p>Every computer around me, including the expensive, oversized paperweight they call an iMac was detecting the aforementioned ubiquitous wireless networks (there are two). So why wouldn't my netbook see them?</p>
<p>I'd been planning on tartsenefeding my computer anyway (which is like defenestrating, but backward; I'm not throwing the device out of a window, I'm throwing Windows out of the device) because of my problem with the bloated OS image, but I decided to use this as a troubleshooting opportunity as well.</p>
<p>I put together a bootable flash drive with Ubuntu 10.04 on it and a 1 GB persistence file, and booted to it. I installed the Broadcom STA Wireless driver and rebooted. Thanks to the persistence file, the driver remained intact upon second boot, and I immediately had access to four wireless networks. For those not counting, that's two more than every other PC in the room saw. And for those wondering, both boots and the driver install took less time than it took to boot the native Windows image once.</p>
<p>Thanks to Ubuntu, my netbook runs faster and more stably than anybody else's in the office, and the hacker in me is satisfied without my ever having to hack anything. I realize, too, that Ubuntu isn't exactly a lightweight distro, and that Puppy (yes, I know it's really Ubuntu) or Knoppix or even a custom-built Slackware would almost certainly run faster and more stably. But considering the convenience factor of getting Ubuntu installed (20-30 minutes), its built-in support for encrypted file systems (a mandate for these netbooks), and its overall great appearance and performance, it's probably the best distro for this netbook at the moment.</p>Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com10tag:blogger.com,1999:blog-8335314075484393906.post-89081241444609483082010-03-10T09:14:00.001-06:002010-03-10T09:14:38.221-06:00How to almost get Netflix Watch Instantly to work in Linux<p>I can make Netflix Watch Instantly work on my Linux media center.</p>
<p>Almost. Yes, almost. No, I don't have it working. But I thought that documenting how I've gotten this far, with a little help from the community, might bring some support to the topic.</p>
<p><b>Here's how I did it.</b></p>
<ol>
<li>Start with your favorite Linux distribution and <a href="http://www.winehq.org/download/" target="_blank">install the latest version of WINE.</a> I tried version 1.1.40 with near success.</li>
<li>Download the <a href="http://www.mozilla.com/en-US/firefox/all.html" target="_blank">latest version of Firefox for Windows</a> and install it into your WINE instance.</li>
<li>Launch the WINE instance of Firefox and browse to <a href="about:config" target="_blank">about:config</a></li>
<li>Get past the warning screen, then right-click in the main region. Click New → String. You'll be given two dialogues:</li>
<li>In the first dialogue, enter <tt>general.useragent.override</tt></li>
<li>In the second dialogue, enter <tt>Mozilla/5.0 (X11; U; Windows NT 6.0; en-US; rv:1.9.2) Gecko/20100115 Firefox/3.6</tt></li>
<li>Close Firefox and relaunch it.</li>
<li>Browse to <a href="http://www.netflix.com" target="_blank">netflix.com</a>, log in, and go to the watch instantly page. Magically, you'll be transported beyond the "OS not supported" page you're used to, and you'll be given the option to install the Silverlight plugin. Do so.</li>
</ol>
<p>It's about here that you'll notice a crash. Firefox in WINE doesn't like it when the Silverlight plugin tries to install and fails. When you repeat step 8, you'll notice that you actually get to the page where the video ordinarily plays, but the image is scrambled and Firefox soon crashes.</p>
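<p>Incidentally, if redoing the about:config dance after every profile reset gets old, Firefox also reads preferences from a <tt>user.js</tt> file in the profile directory. A sketch — a temp directory stands in for the real WINE Firefox profile path, which varies per system:</p>

```shell
# Write the user-agent override into user.js; Firefox applies user.js
# prefs on every startup. The profile path here is a stand-in for the
# real one under ~/.wine/drive_c/.
profile=$(mktemp -d)
cat >> "$profile/user.js" <<'EOF'
user_pref("general.useragent.override", "Mozilla/5.0 (X11; U; Windows NT 6.0; en-US; rv:1.9.2) Gecko/20100115 Firefox/3.6");
EOF
grep -c 'general.useragent.override' "$profile/user.js"   # → 1
```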
<p>So here's the part where I ask for some support from the amazing Linux community. Does anybody have any suggestions as to where to go from here? I'm not sure what parts of the user agent string are being looked at, but it seems more like the Silverlight plugin for Firefox just doesn't work in WINE. How can we go about making Silverlight work?</p>Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com144tag:blogger.com,1999:blog-8335314075484393906.post-66515487983215058972009-09-10T12:04:00.001-05:002009-09-10T12:05:20.092-05:00Please bear with me<p>I recently changed a setting that has borked many of my previous posts in that a lot of the whitespace has been eliminated. I'll fix this when I get around to it, but it's a lot of work.</p>Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com1tag:blogger.com,1999:blog-8335314075484393906.post-31313721343546555592009-09-10T09:47:00.003-05:002009-09-10T19:11:10.546-05:00Eight Things Windows Needs Before I'll Contemplate Using it Again<p>Windows 7 is better than Vista. Great. But saying that is like saying you'd rather catch the common cold instead of swine flu. I've demoed the release candidate for Windows 7, and I can safely say that I still don't like it. Aside from the default options being obnoxious and hard to use (the icons for running applications are identical to the directly-adjacent Quick Launch icons; running programs have no text to show you what they are; unless you have the hardware to back up the Aero interface, you can't get the window previews to help you, either), there are several things I need to see in a Windows operating system before I'll even contemplate switching back.</p>
<p><ol>
<li><b>Multiple virtual desktops</b> — Windows is pretty much the only significant operating system that does not support this. Mac OS X's desktops may not be implemented very well, but they're there all the same. My cell phone has multiple desktops. Why can't Windows get with the program on this? It's an invaluable feature which reduces clutter. I think you'll find that clutter reduction is centric to many of my needs.</li>
<li><b>Application organization</b> — When I click on the Start Menu in Windows, I have a list of programs to sort through which aren't even alphabetized until I tell them to be. The list is huge, presenting me with a different "folder" for each program I have installed. When I have to go looking for a program, I want to be able to look in one of these "folders" that tells me what <i>type</i> of program it is. Is it an Internet program? A productivity program? Is it a minor accessory? One of my programming applications? Keeping this kind of organization to programs keeps the list short, which would be a blessing considering the tiny, half-height, scrolling list of applications which contains six times as many programs as will fit in its frame. Microsoft tried implementing something like this with games when they launched Vista, but that doesn't work automatically for everything because it's layered on top of the existing system, not integrated as part of the system. The way they implemented it required you to open a new window just to see your shortcuts. First of all, that's counterintuitive. Secondly, it clutters my desktop.</li>
<li><b>Useful window management</b> — In Linux, I can click and drag windows across my multiple desktops by dragging to the edge of the screen in the appropriate direction. I can move a window by holding the Alt key and clicking and dragging anywhere at all on the window. I can move a window to the best location and resize it so that it's as big as it can get without overlapping any windows it wasn't already overlapping at a single keypress (<a href="#video">see video below</a>). In Windows 7, they have added some window management features where the movement of a window to an edge of the screen resizes the window to fill half of the screen along that edge. Whoopee. What if I don't want it at exactly half size? What if I just want my window on the right-hand side of the screen? There's no customization here, only an assumption that I want my windows to be exactly where Microsoft wants them. I sincerely hope this feature can be turned off.</li>
<li><b>Installation across drives</b> — As it stands, I get a tiny speed boost and a major OS installation advantage by being able to install my home directory on a different drive or partition than the rest of my OS. This is great for home users because it means they can reinstall the operating system without damaging any of their personal data or application settings. It's also great for server users because MySQL databases can sit on a RAW partition, which is often faster because they don't have to follow the rules of the filesystem that way. The best I can manage in Windows is to create a separate partition and manually save and copy files to that partition after the fact. Nothing will be automatic, and I will have a large separation in functionality between the two. Unlike Unix OSes, Windows does not mount all filesystems fluidly together.</li>
<li><b>Security built in</b> — With Vista, Microsoft attached "User Account Control" to Windows, and that turned out to be a major annoyance that did little to aid security. It prevented nearly every program from running because Windows required administrative privileges to run nearly every program. When all users have instant administrative control, that's a bad thing, and a security problem. That's why they pushed UAC through. But UAC popped up for everything, and most users just turned it off so they could be allowed to use their computer. Again, this is a bad thing, creating even more of a security problem. With Windows 7, not much has changed. Users can now select how many UAC warnings they receive. What will be the effect of this? Just like last time, users will either be annoyed or turn it off. Still a bad thing. Still a security problem. When Microsoft manages to write an OS that has security layered into its core, when they can sort out what should and should not require administrative privileges, they might have a chance at winning me over.</li>
<li><b>Fragmentation-free file system</b> — I don't want to have to spend hours every month defragging my hard drive and slowing my computer to a crawl because my operating system allows fragmentation to happen. I certainly don't want my computer to do this in the background on a schedule that I'm unaware of, slowing my computer down when I need to use it. Mac and Linux filesystems are designed to keep fragmentation to a minimum in the first place. A defragging program is not the proper solution. The NTFS file system is about a decade old now. It's no longer "New Technology." I never wanted to work for my computer in the first place, and it's time to ditch this abhorrent system.</li>
<li><b>Singular application installer and updater</b> — In Windows, when I want software, I go to the Internet and either search Google or go to a website that I know carries that software. I install it using a six-page installation wizard that probably only needs to be a one-pager. I install software one program at a time. And then a week later, when the fifteen programs I took the time to install last week have been updated, I have to either download the software from the individual websites again and then reinstall them all separately through more wizards, or I must run fifteen separate updater programs in the background constantly, just waiting there for an update to happen. Neither of these are viable options. Linux uses central, customizable repositories to pipe software through a single, centralized installer/updater/uninstaller program that allows me to install, update, and uninstall as many programs as I want simultaneously, and in one fell swoop. Again, even my cell phone does this. And as with the fragmentation problem, Microsoft should fix the core problem instead of adding on layer after layer of faux-solution to bandage it.</li>
<li><b>Let me customize!</b> — I don't want the ugly Aero interface, and I don't want the even uglier atrocity I get when I turn Aero off. I want something that I like and that I choose. I want my colors in front of me. I want my style, my appearance, my everything. Please, Microsoft, let me do this without paying for <a href="http://www.stardock.com/" target="_blank">third-party software</a> that only adds a separate layer to the problem. The software linked above uses the Windows API to accomplish this, which means the functionality already exists deep within the Windows system files. If Stardock can do it, so can you. You can build it straight into the software. <i>Do it, already!</i> Let me use my computer the way I want to.</li>
</ol></p>
<a name="video">
<object width="711" height="400"><param name="movie" value="http://www.youtube.com/v/35M2sVYRVuY&hl=en&fs=1&"></param><param name="allowFullScreen" value="true"></param><param name="allowscriptaccess" value="always"></param><embed src="http://www.youtube.com/v/35M2sVYRVuY&hl=en&fs=1&" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="711" height="400"></embed></object></a>Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com19tag:blogger.com,1999:blog-8335314075484393906.post-60971112241393510422009-07-25T14:22:00.005-05:002009-09-11T11:46:16.485-05:00Internet Explorer 8 provides best web browsing experience<p>Well, that's what this commercial seems to tell us:</p>
<center><object width="560" height="340"><param name="movie" value="http://www.youtube.com/v/2aA_PEltVTw&hl=en&fs=1&"></param><param name="allowFullScreen" value="true"></param><param name="allowscriptaccess" value="always"></param><embed src="http://www.youtube.com/v/2aA_PEltVTw&hl=en&fs=1&" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="560" height="340"></embed></object></center>
<p>Their claims? That my slow browser is probably "several generations old." They tell me that IE8 is "a huge improvement on the speed scale."</p>
<p>These statements are true if I ordinarily run Mosaic. Of course, IE8 is one more generation of the same old software, and to say that IE8 is faster than IE7 is kinda like saying that Windows 7 will be faster than Vista. They'd have to try hard to make it the other way around.</p>
<p>Of course, they completely fail to mention that IE8 fails almost as hard as IE7 when it comes to meeting web standards. Finally, Microsoft managed to turn this:</p>
<center><img src="http://www.sciactive.com/main/images/stories/acidtests/renders/IE7/IE7Acid2Thumb.jpg"></center>
<p>...into this:</p>
<center><img src="http://www.sciactive.com/main/images/stories/acidtests/renders/IE8/IE8Acid2Thumb.jpg"></center>
<p>Congratulations, Microsoft. You finally passed the Acid2 test. But every other browser's been breezing through for quite some time now. Find something else to impress me with.</p>
<p>And how about Acid3? Here's the debacle we know as IE7's rendering:</p>
<center><img src="http://www.sciactive.com/main/images/stories/acidtests/renders/IE7/IE7Acid3Thumb.jpg"></center>
<p>And here's the major improvement shown by IE8:</p>
<center><img src="http://www.sciactive.com/main/images/stories/acidtests/renders/IE8/IE8Acid3Thumb.jpg"></center>
<p>Wow! Now it says, "FAIL" in giant letters just to let you know that it does, in fact, fail. It probably says that in the IE7 rendering as well, but it's difficult to tell what with all the mangled distortion of crap way up there.</p>
<p>This test takes about two to three seconds to run in Firefox and Chrome (with Chrome running about half a second faster). In Internet Explorer, it's about nine seconds. How's that for fast!</p>
<p>One generation older and seven seconds slower.</p>
<p>On an unrelated note, after Microsoft's failed ad campaign starring Jerry Seinfeld, you'd think that other has-been performers from the 90's would realize that perhaps Microsoft commercials are not the best way to make a career comeback.</p>Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com12tag:blogger.com,1999:blog-8335314075484393906.post-56904287519433987742009-07-25T13:54:00.012-05:002009-09-11T11:50:24.244-05:00GoboLinux review<p><a href="http://www.gobolinux.org/" target="about:blank">GoboLinux</a> is a Linux distribution I heard about from a friend who said that it looked interesting for its flagship property - a simpler file structure. I decided to check it out.</p>
<p>I downloaded the distribution ISO from their <a href="http://www.gobolinux.org/?page=downloads" target="_blank">website</a>, which was easy enough, and booted up VirtualBox with that ISO mounted as a drive.</p>
<p><b>Installation</b></p>
<p>Installation was simple enough for an intermediate user to do. Instead of Ubiquity or another similar installer, the Gobo folks have opted for a custom piece of software.</p>
<p>I booted to the live CD's desktop environment. You must manually run <code>startx</code> to start the X server from the live CD. Whether that's good or bad depends on who you are. This distribution seems designed for users with mid-to-high levels of Linux experience, and for them it's no big deal, especially since the first console output tells you exactly what to type. For novices it would be confusing and unappealing, though not insurmountable. I doubt newbies are the target audience for Gobo anyway.</p>
<p>I had to boot to the desktop environment because the command line installer will not run until your drive is already partitioned. I'm sure a command line partitioner was available, but I'm not familiar with any of them, so I opted for the desktop install. Even there, ease of installation is hindered by the same limitation: the installer errors out if the drive isn't partitioned, so partitioning must be done separately before the installer can run.</p>
<p>The installer itself is easy to understand, if a bit unsightly. Again, this won't be attracting novice users, but anybody who knows a bit about computers will be able to figure it out in no time.</p>
<p>Installation was quick, even for the full package. My VM took care of it in 10 minutes or so on low-grade hardware, once I got through the install configuration sections.</p>
<p>Rebooting from the desktop environment was not possible without using the terminal. When I exited my session, I was spat out into a terminal still running from the live CD image. I issued the <code>reboot</code> command and finally got to boot into the installed OS, ready for action.</p>
<p><b>First Boot</b></p>
<p>The bootloader allows for three options, as shown here:</p>
<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhfWQRF2YtgTyelt2B3FkqNe3rOOqL8YYEw7-qOCdvsYahaKKiQTgq3KGzpuqA1_J0Ks-R38xyn3D8xUNi0j9-pgPK7jI_fMwWfbL6srt3EwVVupw8amTXUVQki_PHdU1Cqz7aBj-N84ev-/s1600-h/gobo-01.jpg"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 320px; height: 241px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhfWQRF2YtgTyelt2B3FkqNe3rOOqL8YYEw7-qOCdvsYahaKKiQTgq3KGzpuqA1_J0Ks-R38xyn3D8xUNi0j9-pgPK7jI_fMwWfbL6srt3EwVVupw8amTXUVQki_PHdU1Cqz7aBj-N84ev-/s320/gobo-01.jpg" alt="" id="BLOGGER_PHOTO_ID_5362473334686002274" border="0" /></a>
<p>So far, I have only booted to the Graphic Desktop option, but it's nice to have the other options available. Yet another sign that this is not for your average Joe the Plumber: this screen requires the user to make a selection before it will boot. I waited quite some time, and it never timed out.</p>
<p>The boot is noisy until it hits runlevel 2; after that, the process quiets down somewhat.</p>
<p><b>Desktop Environment</b></p>
<p>The default desktop environment for Gobo is KDE 3.5. I've never personally been a fan of KDE, but I could install Gnome on here if I wanted to.</p>
<p>The login process is more or less the same as any other KDE login:</p>
<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiQaW43IeGB92eScDragqkUPgz_IlYaZOCDvFY24TLxoPrD8syxgJE09A9ynS7HWqZmyagoZJ9pQxROWOha3tMbxdIkcy2uFHQofZTnjvV9QnRYfbwyyYv2vQbV7QyOlMMa-M9SEh_WDAKN/s1600-h/gobo-02.jpg"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 320px; height: 240px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiQaW43IeGB92eScDragqkUPgz_IlYaZOCDvFY24TLxoPrD8syxgJE09A9ynS7HWqZmyagoZJ9pQxROWOha3tMbxdIkcy2uFHQofZTnjvV9QnRYfbwyyYv2vQbV7QyOlMMa-M9SEh_WDAKN/s320/gobo-02.jpg" alt="" id="BLOGGER_PHOTO_ID_5362473504540555170" border="0" /></a><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgp93FyGqHmKAQQtXdFWfkDX5PDxFFIs1fQ91bDSW4Krjy7AkwFg9e3kukckhLZe8vtx-aW430ZBY6tnxW7S7ouZ_ZDu2IP8_eWWqODXSlN7oX2aPlCQbRLP3uJAheu8b1nukpjWMAFbGSy/s1600-h/gobo-03.jpg"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 320px; height: 241px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgp93FyGqHmKAQQtXdFWfkDX5PDxFFIs1fQ91bDSW4Krjy7AkwFg9e3kukckhLZe8vtx-aW430ZBY6tnxW7S7ouZ_ZDu2IP8_eWWqODXSlN7oX2aPlCQbRLP3uJAheu8b1nukpjWMAFbGSy/s320/gobo-03.jpg" alt="" id="BLOGGER_PHOTO_ID_5362473560820354386" border="0" /></a>
<p>The default desktop wallpaper is pretty lame, but most Linuxes tend to use their own logo by default. Fortunately, a collection of good photographs and textures are available, and you can, of course, add more as desired.</p>
<p>Three icons greet you: Home, Trash, and Manager. Manager is the applications management frontend for Gobo. It's recipe-based, and is supposed to provide users with a list of available versions of various software packages. Software can be enabled, disabled, and linked (more on linking later). The distro's website contains a decent collection of recipes for various products. It's poorly organized, but it's <a href="http://recipes.gobolinux.org/r/" target="_blank">there</a> all the same. The search feature helps to find what you're looking for.</p>
<p>Manager itself executes in a terminal, then creates an additional window for the GUI frontend. Naturally, closing the terminal closes the GUI as well. That's a bit unrefined, but not a killer problem.</p>
<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEifCc5-4YbUydngt_XQiRFvbDe3AeFz4Zk3uYjz_GKP-y7fbvN8hPfMN59CIiU1Obhk3f7IA_QrH9R_qbfxkEgWjmEW_lCdzAtjEcPnH94IVYdPDxF96OXqkXLM1DW3H3W4ajcKIAH535rm/s1600-h/gobo-04.jpg"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 320px; height: 240px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEifCc5-4YbUydngt_XQiRFvbDe3AeFz4Zk3uYjz_GKP-y7fbvN8hPfMN59CIiU1Obhk3f7IA_QrH9R_qbfxkEgWjmEW_lCdzAtjEcPnH94IVYdPDxF96OXqkXLM1DW3H3W4ajcKIAH535rm/s320/gobo-04.jpg" alt="" id="BLOGGER_PHOTO_ID_5362473686001682338" border="0" /></a>
<p>The control panel is the standard KDE Control Center, and it handles settings changes well.</p>
<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTO-ryKwJyvinKyL92u8LlM5N-OeN0imedysQhWG_NuD_fAHUgueB1ZoZ7WMGf-WNqs7wOUEpXAf0OeqsC1hXWZOb1E5l5tzCFITwyf8hShrXm4lXhcVnoL9_lDYP4iXY28Dac9uQZWQam/s1600-h/gobo-05.jpg"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 320px; height: 240px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTO-ryKwJyvinKyL92u8LlM5N-OeN0imedysQhWG_NuD_fAHUgueB1ZoZ7WMGf-WNqs7wOUEpXAf0OeqsC1hXWZOb1E5l5tzCFITwyf8hShrXm4lXhcVnoL9_lDYP4iXY28Dac9uQZWQam/s320/gobo-05.jpg" alt="" id="BLOGGER_PHOTO_ID_5362473772185517250" border="0" /></a>
<p>As with other KDE-based distros, the default text editor is Kate. Text editors are a big deal for me, since I hand-code for the web. If anybody else cares: Kate does a decent job with syntax highlighting. It's not as good as gedit in Gnome, but here's a picture of it highlighting most of the JavaScript embedded within HTML fairly well:</p>
<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjZJOEE7TmHtPG8hjVqNlbDF7dlWuTEMHY-syN2Tj4KiSXnbXwW-RAmdMQvpt7qYEWuAr5CLkq-YGHT3-oT8fKZCFqg9SiIseJ6_ZOv7VX1iQb_qiYfCtN_jC6GstkWDqz7f2-nPc0r_wJa/s1600-h/gobo-08.jpg"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 320px; height: 226px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjZJOEE7TmHtPG8hjVqNlbDF7dlWuTEMHY-syN2Tj4KiSXnbXwW-RAmdMQvpt7qYEWuAr5CLkq-YGHT3-oT8fKZCFqg9SiIseJ6_ZOv7VX1iQb_qiYfCtN_jC6GstkWDqz7f2-nPc0r_wJa/s320/gobo-08.jpg" alt="" id="BLOGGER_PHOTO_ID_5362473855257799938" border="0" /></a>
<p>The web browser, as you may have guessed, is Konqueror. Again, I've never been too impressed with KDE and its suite of software, but at least it renders basic HTML, CSS, and JavaScript properly, which is more than Microsoft can say for Internet Explorer.</p>
<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiNg3aSNEm-lB8hPbKQoQOsIHKx_02M2AyXVqtx2iGkq_2W5bdYv8qem84-izAQYKNa6FCMNIPnCKF8ylM_lPKK0oDipvhsaVykcdLcKP958844bvhYMcqo2oWzO6TofThfJdqylsWN0raN/s1600-h/gobo-09.jpg"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 320px; height: 226px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiNg3aSNEm-lB8hPbKQoQOsIHKx_02M2AyXVqtx2iGkq_2W5bdYv8qem84-izAQYKNa6FCMNIPnCKF8ylM_lPKK0oDipvhsaVykcdLcKP958844bvhYMcqo2oWzO6TofThfJdqylsWN0raN/s320/gobo-09.jpg" alt="" id="BLOGGER_PHOTO_ID_5362473888531505122" border="0" /></a>
<p><b>Software Installation</b></p>
<p>As mentioned before, Manager is the GUI frontend to Gobo's software installation system, and that system solves a real problem: there are tons of different ways to install software on Linux. Gobo approaches the problem from the perspective that compiling from source is the best way to obtain the most recent version of a piece of software, and addresses it with a custom program called Compile.</p>
<p>Compile wraps the functionality of the usual build tools (configure scripts, make, and xmkmf) so that source code compiles consistently. It also adapts the install paths used by those build systems to match Gobo's specialized file structure.</p>
<p>Compile uses recipes that record where to get the source code, which version it is, what its checksum is (for download verification), and how the build should proceed. This reduces installation to a terminal command as simple as <code>Compile foo</code>, or <code>Compile bar 1.1</code> for a specific version. If the recipes are properly maintained, this is an easy, effective way to get the job done.</p>
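<p>Conceptually, a recipe is just structured metadata plus a verification step. The Python sketch below illustrates that idea; the field names and the <code>foo</code> package are hypothetical stand-ins, not GoboLinux's actual recipe format:</p>

```python
# Illustrative sketch only -- NOT GoboLinux's real recipe format.
# A recipe records where the source lives, its version, and a checksum
# so a download can be verified before the build step runs.
import hashlib

recipe = {
    "name": "foo",                                  # hypothetical package
    "version": "1.1",
    "url": "http://example.org/foo-1.1.tar.gz",     # hypothetical URL
    "md5": "5d41402abc4b2a76b9719d911017c592",      # md5 of b"hello"
}

def verify(tarball_bytes, recipe):
    """Return True if the downloaded bytes match the recipe's checksum."""
    return hashlib.md5(tarball_bytes).hexdigest() == recipe["md5"]

# Pretend the downloaded tarball contained the bytes b"hello":
print(verify(b"hello", recipe))    # True -- safe to unpack and compile
```

<p>A corrupted or tampered download fails the check, which is why recipe systems publish checksums alongside source URLs.</p>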
<p><b>The Flagship</b></p>
<p>So far, Gobo is a pretty mediocre distribution of Linux, but it does one thing extraordinarily well, and the devs flaunt it rightly. The file structure has been completely reworked. This is the root directory:</p>
<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhri-jjvfih0Fsg93Jk8aCi3rHd8oX4IBYxbsbSd0VYFh7TJboY2kRBtpCoyCWeV7r2QzyZDJTvPpzYJYdAdvNEKkGNLhgvVrQfjAAqzmbyTezgc6mJjP2bli_TLLporEUAOi2A7xwG_5r-/s1600-h/gobo-07.jpg"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 320px; height: 226px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhri-jjvfih0Fsg93Jk8aCi3rHd8oX4IBYxbsbSd0VYFh7TJboY2kRBtpCoyCWeV7r2QzyZDJTvPpzYJYdAdvNEKkGNLhgvVrQfjAAqzmbyTezgc6mJjP2bli_TLLporEUAOi2A7xwG_5r-/s320/gobo-07.jpg" alt="" id="BLOGGER_PHOTO_ID_5362474017580568706" border="0" /></a>
<p>Seven directories. That's it. All of the POSIX stuff is still there, but only as symbolic links pointing into the new locations. All the details about how they do it can be found <a href="http://www.gobolinux.org/?page=at_a_glance" target="_blank">here</a>, but here are the important parts:</p>
<blockquote>"GoboLinux is a modular Linux distribution: it organizes the programs in your system in a new, logical way. Instead of having parts of a program thrown at /usr/bin, other parts at /etc and yet more parts thrown at /usr/share/something/or/another, each program gets its own directory tree, keeping them all neatly separated and allowing you to see everything that's installed in the system and which files belong to which programs in a simple and obvious way."</blockquote>
<p>Here's the breakdown alphabetically:</p>
<ul><li><b>Depot</b> - This is a shared directory for all users. The devs call it a "community area" or an all-users home folder.</li><li><b>Files</b> - This is a shared directory for applications. Files that are needed by multiple programs, but are not necessarily owned by those programs are kept here, such as fonts and wallpapers.</li><li><b>lost+found</b> - This is your standard ext3 filesystem lost+found directory.</li><li><b>Mount</b> - All mountable devices (CD-ROMs, floppies, flash drives) are mounted here.</li><li><b>Programs</b> - Each program gets its own directory here wherein it can install all of its files and folders.</li><li><b>System</b> - Here are the critical system files, the ones that make Gobo work. These include the Linux kernel, boot software, initialization scripts, settings for various software (/etc content, in other words), and all the symlinks that make the new file structure work (and compatible with other Unix-like file structures).</li><li><b>Users</b> - This is the /home replacement, and that's more or less all it is.</li></ul>
<p>Really, the system is quite intuitive. Hisham Muhammad, the lead developer of the GoboLinux project, makes some very convincing arguments for the new file structure on the <a href="http://www.gobolinux.org/?page=doc/articles/clueless" target="_blank">GoboLinux website</a>, explaining how, despite its deviation from the standard Unix/Linux file hierarchy, it remains compatible with the old Unix layout, if not more so.</p>
<blockquote>"Through a mapping of traditional paths into their GoboLinux counterparts, we transparently retain compatibility with the Unix legacy. There is no rocket science to this: /bin is a link to /System/Links/Executables. And as a matter of fact, so is /usr/bin. And /usr/sbin... all "binaries" directories map to the same place. Amusingly, this makes us even more compatible than some more standard-looking distributions. In GoboLinux, all standard paths work for all files, while other distros may struggle with incompatibilites such as scripts breaking when they refer to /usr/bin/foo when the file is actually in /usr/local/bin/foo."</blockquote>
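<p>The mapping the quote describes is easy to demonstrate. Here's a minimal sketch in Python, using a throwaway temporary directory as a stand-in for the root filesystem and a hypothetical program named <code>foo</code>, showing that a legacy-style <code>bin</code> path and the Gobo-style <code>System/Links/Executables</code> path resolve to the very same file:</p>

```python
import os
import tempfile

root = tempfile.mkdtemp()                        # stand-in for "/"
real_dir = os.path.join(root, "System", "Links", "Executables")
os.makedirs(real_dir)

# Put one "executable" in the real location.
with open(os.path.join(real_dir, "foo"), "w") as f:
    f.write("#!/bin/sh\necho hello\n")

# "bin" is nothing but a symlink to System/Links/Executables.
os.symlink(real_dir, os.path.join(root, "bin"))

legacy = os.path.realpath(os.path.join(root, "bin", "foo"))
gobo = os.path.realpath(os.path.join(real_dir, "foo"))
print(legacy == gobo)    # True: both paths name the same file
```

<p>Because the legacy path is just a symlink, a script that hard-codes the old location keeps working unmodified, which is exactly the compatibility point Muhammad is making.</p>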
<p><b>Conclusion</b></p>
<p>After many years of Linux use, I have come to find what I like and do not like about Linux-based operating systems. I want a system that I can install quickly and easily, without much hassle. I love the repository-centric installation and updating of software packages, but I could probably adapt to a recipe-based installation system. I much prefer the Gnome desktop environment to KDE, XFCE, Enlightenment, or any other that I have come across. In these respects, GoboLinux fails to meet most of my expectations of a Linux OS.</p>
<p>But don't let that fool you. This is a perfectly stable OS with a lot of benefits. Most importantly, Mr. Muhammad has a really great idea going with this retooling of the file structure. It's user-friendly, system-friendly, and Unix-friendly. It's simple and clean. It goes a long way toward simplifying Linux for average computer users. While many aspects of the OS are more difficult to use and while I believe the new structure could use some further improvement, Gobo makes a great point with this. Many Linux users will disagree, and that's fine. That's why we have communities and a multitude of distributions. I personally wish that Ubuntu or Fedora would adopt a system like this. It could take those other distros a long way.</p>
<p>I won't be using GoboLinux on a regular basis, but I'll be checking back in on it in a year or two to see how far it's come. This is a distro with major potential.</p>Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com19tag:blogger.com,1999:blog-8335314075484393906.post-78352880662320309082009-07-11T20:18:00.002-05:002009-09-11T11:53:47.526-05:00Seven complaints about Linux and why Windows users make them<p>I, like most Linux advocates, have demonstrated the power of Linux to people who had no idea such power existed in technology, much less in free technology.</p>
<p>I have a friend who was at my house one day while I was at work. For some reason, he needed me to reboot my computer. So I did, via SSH. When I'm at work, I stream music off of my desktop at home for my listening pleasure via the web. Sometimes if I need to do something from my desktop when I'm working, I'll X-forward an application to my work computer and do it that way.</p>
<p>People see all of these things, not to mention the various desktop effects that Compiz provides, and marvel at how awesome and powerful it all is. They ask what it is and how to get it. But when I help them set it up, they get frustrated and quit. These are the problems faced by many a Linux demonstrator, and the complaints I hear are remarkably consistent. Always.</p>
<p>But I'm not here to complain. I'm here to solve problems. So I will offer a solution to each complaint that I've heard a dozen times as I go.</p>
<p><b>The first complaint:</b></p>
<blockquote>"I put a DVD in my player and it won't play."</blockquote>
<p>This problem, like most of them on this list, is the result of user ignorance brought on by some external force or another. This particular problem comes about when terms like <b>DVD</b> and <b>MP3</b> become marketing buzzwords. Their definitions are reduced to the most distilled, basic descriptions of what they should be because products sell better that way. Here's how these words look in a Best Buy salesman's dictionary:</p>
<p>DVD - <i>n.</i> A movie.</p>
<p>MP3 - <i>n.</i> Music.</p>
<p>But they're both so much more than that. They're patented formats, which makes all the difference. Any time you purchase a DVD, a DVD player, or a DVD-ROM drive, any time you buy a device that plays MP3 files, part of what you pay goes to the owners of the DVD and MP3 patent licenses. Some company turns a profit every time you buy a DVD or MP3 player, whether or not they manufactured or even designed the device. When you pay Microsoft for Windows Media Center Edition, which plays DVDs out of the box, you're buying the right to watch DVDs on that device.</p>
<p>Most Linuxes are free of cost, but more to the point, they're about freedom of use. How free are you, really, to play a DVD you own? In other words, why should you have to buy the right to watch DVDs that you have bought from a store, rented, or made yourself? That's not freedom, which is why most Linuxes don't ship with DVD decoding enabled out of the box.</p>
<p>But people shouldn't be discouraged by this. The folks over at <a href="http://www.videolan.org" target="_blank">VideoLAN</a> distribute a library called <a href="http://www.videolan.org/developers/libdvdcss.html" target="_blank">libdvdcss</a> that decrypts DVDs just fine. And your music, those MP3s, AACs, WMAs, WAVs, and whatever else you may have, can be played by installing the appropriate plugins from the amazing folks developing <a href="http://www.gstreamer.net/" target="_blank">gstreamer</a>. Installing these packages is simple enough, either by direct download from those websites or through whatever repositories your distribution makes available to you.</p>
<p><b>The second complaint:</b></p>
<blockquote>"Some piece of my hardware doesn't work."</blockquote>
<p>This is a more legitimate complaint than the first, but I still object to the reason it's valid. Microsoft has a long history of striking promotion deals with hardware makers under which the makers agree not to write drivers for non-Windows operating systems, or even to disclose how the hardware works to non-Microsoft developers. This effectively prevents Linux drivers from being written.</p>
<p>Unless, that is, a truly dedicated Linux dev decides to reverse engineer the driver from the activities and data transmissions of the hardware. This is difficult to do, and time consuming. These devs are a benefit to us all.</p>
<p>Even with this understanding that the disposition of Linux hardware drivers is dismal, I still can't validate fully giving up on such a problem. Rarely is a piece of hardware rendered completely unusable due to a driver issue. Forums exist for the sole purpose of spreading functional hardware drivers for Linux.</p>
<p>The main complaint used to be that wireless cards never worked. Back in the day, you might have had to wrap the card's Windows driver with a utility like NDISwrapper, or hunt down an out-of-tree driver like MadWifi. Those days are gone. Ubuntu now ships drivers for Intel, Atheros, and Broadcom wireless chipsets, automatically detected and installed in roughly three clicks of the mouse. Linux is no longer in the sorry state it used to be in, though I'm not sure I'd have called it that anyway. Let's not forget that Linux initially grew and spread as people passed the young OS around, writing drivers to match their own hardware and creating, in the process, a massive driver base.</p>
<p><b>The third complaint:</b></p>
<blockquote>"Installing software is too hard."</blockquote>
<p>This is an almost unforgivable mix of terminology confusion and the false belief that Linux is just like Windows, only better.</p>
<p>A Windows user wants to install a program by putting a disk in their computer and waiting for an installer to pop up onscreen. Then they want to go through seven wizard pages, ignoring all settings, just clicking "Next" a bunch of times. Then they want to wait while it installs, then have to dig through thirty unorganized program listings to find the software, then dodge some links to various readme text files and Internet links to corporate sponsorship websites to finally launch their program. And they want to do this for every single program they install. It's simply too much of a hassle to learn that Linux does things <b>differently</b>.</p>
<p>In Linux, you run a program that was installed when you first put the OS on the system. You search there for the software you want. You check a box suggesting that you want this installed. You do this for all programs that you want installed. Then you click an Okay button and wait while all software is automatically downloaded, all dependencies taken into account, then installed and configured to a predetermined best practice set of settings. Then they get organized by type - Internet, Games, etc. - so you don't have to pick through so much, and then they're alphabetized. The process is simpler, faster, and more organized. It's not harder. It's just not what they're used to. Uninstallation is just as easy.</p>
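<p>The key phrase above is "all dependencies taken into account." The toy Python resolver below shows the core idea; the package names are made up, and a real package manager such as APT additionally handles versions, conflicts, and downloads:</p>

```python
# Toy dependency resolver -- illustrative only; package names are made up.
deps = {
    "mediaplayer": ["codecs", "gui-toolkit"],
    "codecs": [],
    "gui-toolkit": ["xlibs"],
    "xlibs": [],
}

def install_order(package, resolved=None):
    """Return packages in the order they must be installed."""
    if resolved is None:
        resolved = []
    for dep in deps[package]:
        install_order(dep, resolved)    # install dependencies first
    if package not in resolved:         # never schedule a package twice
        resolved.append(package)
    return resolved

print(install_order("mediaplayer"))
# ['codecs', 'xlibs', 'gui-toolkit', 'mediaplayer']
```

<p>Ask for one program and its prerequisites are scheduled first, which is why a single checkbox in a Linux package manager can quietly pull in everything a program needs.</p>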
<p><b>The fourth complaint:</b></p>
<blockquote>"Linux is not good for gaming."</blockquote>
<p>Again, we see a confusion of terminology. Linux is actually great for gaming, especially since you don't have so many of your resources tied up in the OS alone. (Side note: generally, I speak of Windows XP here since that's still the version that's most common in the world, but it gets worse with Vista and Windows 7.)</p>
<p>The problem is that Linux isn't popular enough for commercial game companies to bother porting all their games over; it's not financially worthwhile. This, I will admit, is a legitimate reason to keep a Windows installation around. I myself dual-boot between Windows XP and the most recent release of Ubuntu.</p>
<p>That field is changing, however. id Software is porting more and more games to Linux. Some older games are being ported as well, which proves that it's doable and, in the process, establishes a workflow to streamline future ports.</p>
<p>This is not to say that great games don't already exist for Linux. Take a look at <a href="http://www.savage2.com/en/" target="_blank">Savage 2</a>, <a href="http://www.sauerbraten.org/" target="_blank">Sauerbraten</a>, or <a href="http://www.alientrap.org/nexuiz/" target="_blank">Nexuiz</a>. The graphics are phenomenal and the gameplay is great. The only thing these games lack is a storyline, but the tremendous success of the Unreal Tournament series, Team Fortress, and the Quake games proves that a storyline is not an absolute requirement for a good game. If you're looking for something with a plot, check out the <a href="http://www.penumbragame.com/media.php" target="_blank">Penumbra</a> series.</p>
<p>Linux is great for gaming, some might argue that it's actually better for gaming than Windows, but the games are less available. Just clearing that up.</p>
<p><b>The fifth complaint:</b></p>
<blockquote>"I have this program that runs fine in Windows, but it doesn't run at all in Linux."</blockquote>
<p>The people making this claim apparently never looked at the system requirements for the software, and therefore never saw the part where it required Windows. This problem is to be expected. Linux has an entirely different software architecture beneath it because, well, it's a different operating system.</p>
<p>Giving up at this point shows short patience and ignorance of the possibility that equivalent software exists for Linux. Need a word processor? Try AbiWord. Need a full productivity suite? Try OpenOffice.org. Need a Visio-style diagramming program? Try Dia.</p>
<p>Do none of those match your specifications? Install WINE and try running your Windows software that way. The fact that Linux can execute a large chunk of Windows code is more than Windows can claim in reverse.</p>
<p>If all else fails and that software is an absolute necessity, then, yes, there's always Windows. Feel free to use it.</p>
<p><b>The sixth complaint:</b></p>
<blockquote>"I asked how to do something, and they told me to type commands. That's not intuitive!"</blockquote>
<p>Chances are this user was told to do it on the terminal because it was</p>
<ol type="a"><li>The easiest way to describe the solution in a message board or chat room, or</li>
<li>The fastest way to get the job done.</li></ol>
<p>There's probably a way to do it through the GUI, but the user asked how to fix something that had been bugging them, so they were shown the fastest method, if perhaps not the method they would have liked.</p>
<p>Besides, the terminal's not the scariest thing in the world. Running an apt-get command isn't the same as compiling software from source. Being told to use the terminal once, and being told exactly what to type, is not the most complicated thing one can accomplish on a Linux system, and it's not a constant requirement. Most full-time, die-hard Linuxers will use the CLI very regularly. I know I do. That's because we know it's the fastest way to get things done and we've taken the time to learn it. That's not really expected of a first-time user, especially one who isn't technologically inclined. Next time, the user should specifically ask how to do the task using the mouse instead of the keyboard.</p>
<p><b>The seventh complaint:</b></p>
<blockquote>"Linux is not ready for the desktop."</blockquote>
<p>This isn't so much a complaint as it is the culmination of all the others. People fear new and different things. Human beings don't always react well to change. The natural reaction is to call that change inferior, even in the face of obvious evidence to the contrary.</p>
<p>There are several million Linux users worldwide who fundamentally disagree with this statement, and who disprove it on a day-to-day basis. Millions of people use some form of Linux as their primary desktop OS. In fact, there are millions of people who use Linux every day and don't even realize it. The system is so versatile that it can be tweaked to run on everything from iPods to TiVos to cell phones. It is backed and assisted by multi-million dollar corporations like IBM and Google. Believe it or not, Microsoft runs a lot of their web servers on Linux, which goes a long way toward showing their faith in their own product.</p>
<p>But if people give it a try, adjust to changes for the better, and most of all contribute in some way to the community that keeps advancing Linux and its derivative software far beyond the scope of Microsoft Windows, it just might work out for them. It already has for millions of users.</p>Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com243tag:blogger.com,1999:blog-8335314075484393906.post-23497340841424348472009-07-09T21:50:00.008-05:002009-09-11T11:58:18.450-05:00Google Chrome OS - Let's be reasonable<p>Two days ago, Google published this on their <a href="http://googleblog.blogspot.com/2009/07/introducing-google-chrome-os.html" target="about:blank">blog</a>:</p>
<blockquote>"We're announcing a new project that's a natural extension of Google Chrome — the Google Chrome Operating System. It's our attempt to re-think what operating systems should be."</blockquote>
<p>Naturally, the tech world goes crazy. I've seen people say such polarized things as "<a href="http://www.semiaccurate.com/2009/07/08/microsoft-hands-victory-chrome-os-v/" target="about:blank">Today marks the knee of the great curve of Microsoft's decline</a>" and "<a href="http://econsultancy.com/blog/4174-much-ado-about-nothing-google-chrome-os" target="about:blank">Google's vision is meaningless to consumers.</a>" I've seen people say that this is just what the world needs to push Linux into the spotlight and spread its market share, but alongside those, I've seen people criticize the proposed operating system as a direct affront to Linux. In truth, the tech world is full of zealots and fanboys who will say pretty much anything to draw attention to their OS/browser/etc. of choice. Let's look at this logically, with a level head.</p>
<p>Before we can begin to decide how this affects the computing community at large, we have to define what it is. That said, here are some basic facts about the project according to what Google published. It is</p>
<ul>
<li>Open-source</li>
<li>Lightweight</li>
<li>Targeted at netbooks</li>
<li>Focused on "speed, simplicity, and security"</li>
<li>Capable of booting and being on the net "in seconds"</li>
<li>Running Linux at the core</li>
<li>Running a new window system (not X)</li>
<li>Running the webkit-based browser Chrome on top of that</li>
<li>Not Android</li>
</ul>
<p>Let's tackle the two obvious questions.</p>
<p><b>What does this mean for Microsoft?</b></p>
<p>Quick answer: who knows? Certainly, Microsoft's long-standing campaign of deceit, condescension, and failure to meet promises is beginning to fail them, especially in light of Apple's advertising campaigns of the past few years, which have been an unabashed, straightforward attack on Microsoft, even if they were largely a heap of fibs themselves. At the end of 2006, it would appear that Microsoft had roughly 94% of the market depending on how you measure it (and if <a href="http://en.wikipedia.org/wiki/Usage_share_of_desktop_operating_systems" target="about:blank">Wikipedia's sources</a> are accurate), while Mac claimed 5-6% and the various Linux distributions carried less than 1% of the pie. At the end of June 2009, those same sources show various Windows OSes occupying a little less than 88% of the market and Mac's share increasing to roughly 9-11%, with Linux still hanging on at a little less than 1%, though slightly above its earlier share.</p>
<p>Chrome OS fits a niche market: the netbook. These are those (almost annoyingly) small computers that are meant to have only enough processing power to put a person on the Internet and do some basic word processing. They are definitely helpful for the technologically disinclined. A person wanting to actually accomplish something impressive will look elsewhere, but netbooks definitely have a good reason to exist. I'm sure that Chrome OS will run quite nicely there. It's a niche that Microsoft will have a hard time filling, even with Windows 7, which is intended to do just that.</p>
<p><a href="http://windows.microsoft.com/en-us/windows7/products/system-requirements" target="about:blank">Win7 requires 16-20 GB</a> of free space on the hard drive, and that's before setting aside any space for other software. Microsoft Office will then chew up <a href="http://office.microsoft.com/en-us/suites/HA101668651033.aspx#2" target="about:blank">another 1.5 to 3 gigs</a> depending on how much of that software you need. Tack on the high video card requirements and you have a stodgy, slow, and ugly operating system that devours more of the disk than necessary.</p>
<p>Most of the affordable netbooks have much larger disks than that, so the wasted space may not mean as much as it used to. However, the fact that Windows 7 sets up automatic defrags, even on hardware such as solid state drives where the act of defragmentation is both unnecessary and physically damaging, makes it a dangerous operating system to run on a netbook without doing some major tweaking first -- tweaking that the kind of people who create the netbook market probably won't know to do.</p>
<p>So what's worse for Microsoft, really? Is it really Chrome OS that will do them in, or is it Microsoft's own horrendous software? Besides, what Chrome OS proposes to be is something even less than what many Linux distributions already are. Where Ubuntu's most recent bit of awesomeness -- <a href="http://www.ubuntu.com/getubuntu/download-netbook" target="about:blank">Ubuntu Netbook Remix</a> -- offers a full desktop experience with a complete productivity suite, an IM client that works across practically all chat protocols, and other common net-based software about three clicks away (blogging clients, microblogging clients, casual games, etc.), all in less than 1 GB, Chrome OS will only offer a browser that isn't very extensible yet. The Chrome browser may execute JavaScript faster than anything else right now, but there are tons of standards that it still doesn't support, and a person's web browsing experience will be less "correct" there than with Firefox 3.5 on any OS at all. Windows 7 can't fit any of those bills either.</p>
<p>It looks like it <b>could</b> affect Microsoft in a couple of ways, though.</p>
<p>First, it <b>might</b> expose Linux to more people. Don't let me confuse you with that statement and that emphasis. What I mean is that it will succeed. Trust me. Google has enough money to create muscle in the market. It'll be popular to some degree or another, even if only within this niche netbook market. But even though it's Linux-based, and even though it's open source, I sincerely doubt that the terms "Linux" and "open source" will have anything to do with the advertising plan. Those terms would only confuse customers who might otherwise be willing to buy into the concept.</p>
<p>Second, Google's tactic of late has been to simply undercut Microsoft's pricing, which is easy to do given the outrageous dollar value Microsoft places on their operating systems. Since COS will be Linux and therefore open source, undercutting that cost is almost expected. Linux already does this, but has little to no advertising behind it, and telling a potential user that there are hundreds of Linux OSes to choose from doesn't exactly help win over the hearts of the masses. COS will provide a solution to both of these problems, and that will definitely help reduce Microsoft's monopolistic, anti-competitive hold on the computing world as a whole.</p>
<p><b>So where does that leave Linux?</b></p>
<p>Linux has, up until recently, been a "geek" thing, and that's a mildly unfair stigma that has only been slightly loosened by more recent, "user-friendly" distros like the Ubuntus, including Linux Mint and other derivatives. This will not change. I keep hearing people make very bold claims that this is either <b>really, really good</b> or <b>really, really bad</b> for Linux on the whole. But one of the founding principles of Linux is freedom of choice. If you wish to not run COS, there is nothing stopping you from using an Ubuntu or a Gentoo or Fedora or Sabayon or PCLOS or any other distribution. Hell, you could even run Windows 7 on it if you so choose. One's decision about what OS to run is exactly that -- one's own decision. The introduction of COS to your miles-long list of options only changes things inasmuch as your list would then become miles long plus one.</p>
<p>In fact, Linux has a chance to improve greatly. Having a popular Linux OS on the market backed by an unbelievably filthy rich corporation provides incentive for hardware manufacturers to release more official and better hardware drivers for the Linux platform. I'm not complaining about the quality of existing drivers, only the lack of drivers for a great deal of hardware. It's one of the main reasons why people quit Linux before they really begin. They can't get audio working, or their video card's driver can't drive X at the right screen resolution, or they really need better Bluetooth support. Granted, Linux has better hardware support than Windows in most cases, at least so far as having it ready at install-time is concerned, but overall it lacks in this department. That's not Linux's fault. You can legitimately blame Microsoft for this. Their long-time practice of colluding with hardware manufacturers has been so deeply entrenched in the technology community that special words have been coined to describe it. Consider "Wintel," for instance, coined to describe the Brangelina-style marriage of Microsoft to Intel. Having a Linux-based OS in the mainstream could help facilitate a long-needed divorce.</p>
<p>In short, Chrome OS isn't going to change the world. It barely even has a home in the small portion of the market that it intends to occupy. Perhaps it <b>will</b> occupy it, maybe completely. If you consider that the only other <b>major</b> OS suitable for netbooks is not as suitable as Microsoft would lead you to believe, you've got a clear win for Google. But if you think that perhaps Windows 7 is more <b>capable</b> on a netbook than Google's offering, then there's a clear win for Microsoft.</p>
<p>Only one thing's for certain. Microsoft now has real competition, and it's been a long time since they've had to react to that. I guarantee that Microsoft cannot simply purchase Google, continuing their procedure of Assimilate and Deprecate that they've become adept at when facing opposition. And that is definitely a good thing.</p>Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com14tag:blogger.com,1999:blog-8335314075484393906.post-57116741894336822472009-05-23T22:38:00.000-05:002009-07-09T22:41:35.232-05:00Windows 7 is supposed to run on lower-end hardware than Vista...but it doesn't?
<a href="http://blogs.zdnet.com/microsoft/?p=2643" target="about:blank">Here's the article.</a>
I really don't need to say much here, but I feel it necessary to point out that Microsoft has been saying for a very long time now that Windows 7 will be faster, smaller, and lighter than Vista. Then they release specs saying your video card will need to be just as powerful, but you'll need twice the RAM and an extra 1-5 GB of hard drive space for the OS alone.
Sure, the OS may be faster, but it's definitely not smaller or lighter. As for me, I discovered the other day that I can boot Ubuntu 9.04 "Jaunty" from POST to desktop in less time than it takes to launch Microsoft Outlook. And Jaunty consumes less than 4 GB of hard drive space, less than 256 MB of RAM, and my desktop effects run fantastically on 64 MB of video RAM.
I'm not gonna go into a rant here because the numbers speak for themselves. Just saying.Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com1tag:blogger.com,1999:blog-8335314075484393906.post-18958847157087363862009-05-14T22:42:00.002-05:002009-07-09T23:10:25.662-05:00What if AMD went out of business?There are a slew of lawsuits being filed back and forth between the major computer and video processor manufacturers. Intel's suing NVidia, claiming that NVidia doesn't have the right to make VPUs that integrate with specific, Intel-patented processor assistant technologies. A company called Techsearch is suing Intel because Intel's hardware is based on an archaic model that was patented years ago by another company called International Meta Systems (IMS) that has since gone under, and whose patents were purchased by Techsearch. AMD is suing Intel on anti-trust charges, trying to convince the courts that Intel is trying to be anti-competitive and own the world's computing platform design.
A few years back, Intel finally manufactured a processor that marginally outperformed AMD's equivalent, and since then, rumors of AMD's demise have circled the Internet and geek culture. Buzz has also been passed around saying that AMD has a real knock-out product in its future, a set of integrated processor-based technologies that will blow Intel back out of the water, but that it does not currently possess the R&D budget to complete it. Waiting for AMD's ultimate failure is like waiting for a bomb to drop, because it would mean Intel's monopolization of the PC hardware platform.
I, like many others, do not want to see AMD go out of business, and not just because of what that means for Intel. AMD has consistently produced absolutely amazing processors. I've used AMD for years, and would not go back by choice. Even with the Phenom product line, which Intel advertises as a failure and which consistently posts lower benchmark scores than the Intel Core 2 lineup, I chose the Phenom X4 to build my latest computer. I did this because it was a full $100 cheaper than the Intel Core 2 Quad I had been eyeballing with distaste, and the benchmark disparity is really quite minor, if consistent. AMD has always provided their processors at a lower price than the Intel equivalent, even though they benchmarked better right up until the Phenom line, and the "more for your dollar" principle has always applied.
But what <b>would</b> happen if AMD crashed? Could it actually benefit the computer market in the long run? I know this sounds crazy, but hear me out.
In this dystopian fantasy world, AMD goes under and Intel monopolizes the computer processor market. Intel will inevitably have anti-trust charges brought against it. After litigation takes place, some amount of reparation will happen, potentially benefiting the owner(s) of the AMD namesake and its operational facilities. But Intel cannot necessarily be forced into outright paying AMD to continue operation, so it's very likely that it would instead be forced to give up its licensing hold on the x86 architecture.
Long story short, AMD has always licensed this architecture from Intel, paying regularly for the right to produce processors that run the same operating systems that have sat atop the Wintel *cough* I mean Intel platform all this time. In this manner, Intel has always profited from AMD's success. But if the license no longer cost money, AMD's total cost of operations would be severely reduced, and the market would also be opened up to other, more independent developers. That means competition (which was the point of the litigation in the first place), and that's always good. I would venture to say that I would stop buying AMD hardware if another competitor produced better hardware at the same cost.
This new processor market would benefit everybody, since competition becomes more possible, and competition is the precursor of innovation: hardware keeps getting better at a faster rate. But do we really want to spend two or three years under the reign of a monopolistic Intel to get there?
On the other hand, Intel would have to play its cards well during this proposed AMD downtime. The fear goes that Intel would own the market and jack up its prices. Having no alternative, the market would be forced to pay more and more for the same crappy products Intel has always produced, only they might end up crappier than ever because of the lack of competition. Intel would be safe from consumer brand-switching. However, if Intel knows or suspects in advance that AMD (or other competitors) could resurge through litigation, it would be wise not to do this. AMD, despite their license payments, has always managed to keep prices lower, so even if their return to the market greets us at the same prices as before their failure, AMD will still appear even better for the dollar than they do now. Any other competitors would enjoy the same advantage. This might even lead to a cycle where Intel is knocked down and bounces back.
Who knows, though? This is all just speculation on my part. In all reality, I want Intel to succeed. But only if AMD does as well. The competition is what keeps me able to build a system for $1000 that Dell sells for $4000. My prices stay low and I have a satisfying option. Live long and prosper, AMD.Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com1tag:blogger.com,1999:blog-8335314075484393906.post-83490232245830605752009-04-15T22:46:00.001-05:002009-07-09T22:54:49.958-05:00Cloud Computing - IT'S A REALLY BAD IDEA, GUYS!My CS instructor is a believer in cloud computing, and after reading an article that he sent to me singing the praises of cloud computing, I wrote this in rebuttal of its argument. For the record, I really like my CS teacher. He's intelligent and funny and he seems to know what he's doing. I just disagree on this one point. So I thought I'd post what I wrote here, just to continue the argument I started a while back.
<hr>
In truth, this article only goes to further my dislike of the concept. When we centralize processing power, when it becomes a utility that we buy into, a few things happen:
<ol>
<li>The center becomes an obvious and frequent target for malicious crackers who want my data.</li>
<li>Centralized processing providers become a corporate entity, and they become exactly as inefficient and careless as other corporations providing any other product or service (more detail in a moment).</li>
<li>The economy on the whole takes a downturn.</li>
<li>The experience of the individual computer user is unfairly diminished, as is the value of the non-cloud system we currently have.</li>
</ol>
In fuller detail...
<b>Data Mining</b>
I work for a company called <a href="http://www.mm-games.com" target="_blank">Multimedia Games</a>. We develop video slot machines, mostly for Indian casinos on reservations. When you play these video slots, you're really playing Bingo cleverly disguised to look like video slots. Since we are bound by a lot of legal restrictions, Bingo must be played exactly by the rules, and that means that multiple people need to be playing at once to get a game result (winners/losers), and that means that a network is involved. We have a rack of servers installed at each casino that plays the games and sends the win/loss data to the playerstations, then records financial data, gameplay data, marketing data, and a ton of other data related to casino-specific needs in a database. This data is crucial to the continued operation of my company and the individual casinos. Also due to high regulation, we go through tons of measures to make sure that this data remains secure. To host this data on shared devices or virtual machines in the public domain (meaning The Internet, not in the copyright sense) would be an extremely dangerous move, one that poses a serious risk to our business.
This is a specific case, but it's a bad idea in general. Any time that you gather data onto one set of centralized servers, those servers become an immediate target for crackers who want that data for malicious purposes. Hardly a month (or a week) goes by without some kind of attack or security failure on an email server or MySpace or Facebook or an operating system in general becoming public knowledge. Technology is not infallible. In fact, when it comes to security, the best we can do is make it difficult on attackers. No amount of protection will keep out a person with enough knowledge and determination, and that alone is reason enough to distrust a centralized, utility-style data center.
<b>Corporate Inefficiency</b>
I'll use Dell as an example here. They used to be a small company renowned for their quality computers at affordable prices, backed by a solid warranty and service center. This, of course, attracted more and more customers, and these days, almost 50% of the PC market is populated with Dell desktops and laptops. Dell has infiltrated the corporate workplace and constantly battles Hewlett-Packard for workplace market share domination.
Dell became huge, and decided, as corporations inevitably will, to cut costs. Their hiring policy is now one where most entry level positions last six months before that wave of employees is fired and a new team is brought in. It keeps them from having to give raises to the largest segment of their employees. Their computers are no longer stable pieces of hardware, often shipping with 130 watt power supplies that bust every time there's a mild surge in power, and frequently pass excess power to their motherboards, which are made with cheap capacitors with a tendency to explode, necessitating the purchase of an entirely new computer with surprising regularity. As an employee of a consumer electronics repair facility for two years, I can safely say that 75% of the computers we repaired were Dells, which is inconsistent with their market share, further proving their inefficiency. In addition, Dell has failed to conform to well-defined computer hardware standards as a form of lock-in. The power supplies that fail all the time do not meet the ATX form factor (or mini-ATX, or micro-ATX, or BTX, or any other standard), which means that when one breaks, you have to pay over $100 to buy another low-wattage PSU straight from Dell. It's a way of monopolizing one's own product, despite the fact that better products exist through a million other means. It's unfair to the consumer, especially when you consider that most of their technical support is outsourced to foreign countries, which decreases customer satisfaction but increases profit.
The point is that once people adopted Dell as a trusted name, Dell grew and grew and grew until they were too large to sufficiently support their operation. This will undoubtedly happen to cloud computing within a few years of its widespread adoption. By adopting this plan of allowing corporations to run our operations, we open ourselves up to this kind of reduced functionality and constant problems, which in turn affects our own ventures. Once we're locked into contracts, or, more likely, once it becomes too expensive a prospect to move our data back onto our own private server operation, the company providing a cloud service essentially owns our business operation and controls it fully. The only goal they need to meet is ensuring that the cost of keeping data and load-balanced processing with them is cheaper than the cost of returning operations to our own servers and network.
In addition, a corporation must focus its attention on the customers that generate the most profit, and ignore customers that use the service but belong to niche markets. This can be seen in Netflix, with their on-demand digital viewing service. Up until recently, it required an Internet Explorer ActiveX control to work. People who know that this is a major security flaw complained about it. They wanted it to work in Firefox. Netflix has now accommodated that by using Microsoft Silverlight instead, but nobody likes Silverlight, and this is an obvious example of instituting the DRM that nobody likes via a corporate partnership with a company that nobody likes. Still, Mac users cannot use this service, and neither can Linux users, so both of those niche markets are ignored. A centralized cloud computing service would only end up doing the same thing with more devastating results. "What? You don't like the way this all works because you're a Mac/Linux/Firefox user? Tough! You don't pay this company enough for us to care!"
<b>Economic Downturn</b>
Because of this country's corporate, enterprising outlook on literally everything, we have come to the conclusion that spending less money is the equivalent of doing better business, but this is sometimes untrue. One example cited in the article is the expense of IT professionals and the cost of running a data center. The suggestion made there is that if we consolidate thousands of companies' data centers into one, we can pay one group of IT professionals instead of a thousand, lessening expense overall.
The problem with this line of thinking is that it puts 999 companies worth of IT professionals out of a job. If a company needs to cut expenses, it should cut expenses related to product and physical material. It should use less paper or turn off its workstations at night or settle for cheaper workstations (since, as the article says, we're only utilizing 10-30% of their processing power anyway). It should not dispose of human capital. When citizens get paid, citizens then have money to put back into the economy. Reducing jobs is the last thing we want to do. Have we learned nothing from the current recession?
When we outsource to other countries, the United States as a whole loses money because we're paying US money to foreigners, removing the cash from our economy entirely. It's only a natural Step Two, if you will. Once we've centralized a ton of data centers into one in America, a company will realize that they can just move that data center overseas, pay people less to maintain it, and not have to worry about environmental stipulations that cost money in the States. Everything about the cloud computing concept screams, "Failed economy!" Don't you think it possible (and very ironic) that the very things that allow us to be a developed country could turn us into a third-world country?
<b>Diminished User Experience</b>
When something new happens technologically, it doesn't take long for it to reach the entire community of tech users. It extends into both the workplace and the home. Currently, less than 25% of Internet users have broadband access, and this statistic does not include the nearly 50% of computer users who do not have Internet access at all. If the cloud computing concept ever hit home en masse, it would be detrimental to the vast, vast majority of computer users.
Also, no matter how you cut it, there is simply no way to make centralized processing work for gamers, whose demand would be that 120 frames at 1920x1080 resolution be passed to their computer over the Internet, decompressed, and displayed on their monitors every second. No broadband connection exists that can carry that kind of data that quickly, and even then, the user is still utilizing most of his CPU's power to decompress and display data. Not to mention the fact that Time Warner is about to impose download caps to make money off of people who don't know any better. Cloud computing will likely screw over anybody with this type of connection plan.
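The arithmetic bears this out. Here's a back-of-envelope sketch, assuming uncompressed 24-bit color and ignoring all protocol overhead:

```shell
# Raw bandwidth needed to stream uncompressed 1080p at 120 frames per second
bytes_per_frame=$((1920 * 1080 * 3))           # 3 bytes per pixel (24-bit color)
bytes_per_second=$((bytes_per_frame * 120))    # 120 frames every second
mbit_per_second=$((bytes_per_second / 125000)) # 1 Mbit = 125,000 bytes
echo "${bytes_per_second} bytes/s, or about ${mbit_per_second} Mbit/s"
```

That works out to roughly 6 Gbit/s for a single player -- several orders of magnitude beyond consumer broadband -- and compressing the stream only shifts the work back onto the client's CPU.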
Since cloud computing in the home means that a computer would require a broadband network connection, laptops would lose their primary purpose - portability. If I can't use my laptop without an Internet connection, then why even bother with one? If I can't go to a job site in Middle of Nowhere, Texas and access my data, what's the point in a laptop?
There are a ton of reasons that cloud computing is a really bad idea, but the funny thing about all of it is that cloud computing already exists on a much smaller and safer scale. I utilize this for myself. I run a Linux server out of my home with a dynamic DNS address that always points to my server, despite the dynamic IP address that my provider supplies me with. I run various forms of connection back to that server - it runs Apache, an FTP server, a Samba file share, and an SSH server - but the most important is X window forwarding over SSH. It's fast enough to use, if a little slower than actually being in the presence of my computer, but I can always access my hard drive and every program running on that server as long as I have an Internet connection. But even when I don't, my client computer can still run software and save data until such a point that I can transfer the data back to the server. There's little reason for an attacker to attack my computer because I am only one person who does not store sensitive info on my hard drive. It's much better to attack, say, Google, to gather user data from millions of accounts and years of usage. I can use my server from anywhere, and I definitely benefit from this in ways that I could never benefit from an actual cloud computing service like the one defined in the article.Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com3tag:blogger.com,1999:blog-8335314075484393906.post-81351251272453433302009-03-30T22:55:00.000-05:002009-07-09T22:57:15.341-05:00Cloud Computing Follow-upRamsey replied via Twitter regarding my post on cloud computing:
<i>I think the new in cloud computing is the idea of a cloud OS that keeps most of the data intensive parts on a server.
this would in theory allow someone with an old computer or a netbook to run a fast efficient OS. The cool part comes when they
start sharing the load, as in all of the computers running that os make up the cloud in some small way. Speeds unimaginable!</i>
My response follows:
When the data-intensive parts of computer operation reside on the Internet, everybody's speed slows down because...
<ol><li>Even if there's a remotely operating OS, my netbook still needs a client OS to connect to what that server OS is outputting. My netbook runs perfectly quickly with a decent Linux OS on it. Windows 7 supposedly runs pretty quickly, too. But I don't want to have to pay for a license to <b>two</b> operating systems just to use one computer. Windows is too pricey already, and that's a huge reason why Linux netbooks sell. A Windows license makes up 20% of the cost of a Windows netbook. It makes up more than 50% of the cost of a cheap-ass E-Machine. And you'd like to raise that even higher just so I can make my data vulnerable and out of my control?</li>
<li>If the data resides locally on my computer, then all of that data must be transferred across the net to get to a server where the data can be processed. Considering the generally atrocious upload speeds offered by most ISPs (and hopefully I'm not on some kind of cellular network doing this), my computing speed will decrease drastically, and that's the opposite of what you propose, and the opposite of the direction technology should be heading.</li>
<li>If the data resides on the net, on some server in the cloud, and it gets processed on the same server that handles the data of hundreds of other people, then my data must wait its turn in the processing queue. It would take some pretty powerful equipment to make that happen. I'm not saying it's impossible, just that I don't want to have to pay for that service when I'm perfectly capable of making it happen on my own.</li></ol>
And there are a plethora of reasons why cloud computing is either bad or not viable.
<ul><li>In 2008, <a href="http://www.internetworldstats.com/articles/art030.htm" target="_blank">61.5 million users</a> in the United States were connecting to the Internet via some type of broadband access. That sounds like a big number until you consider that <a href="http://www.internetworldstats.com/stats14.htm" target="_blank">248.2 million users</a> are online. That means that the rest of these computers connect via dial-up or some other slow connection. Translation: less than 25% of Internet users are connecting via broadband, and that's not even considering the number of people who are not online at all, but still use a computer at home. Generalized cloud computing is not viable, at least not in the sense that it's mandatory.</li>
<li>In the setup that I described in my <a href="http://gradysghost.doesntexist.com/blogman/read.php?id=25" target="_blank">previous post</a>, my data is still being passed through the cloud of the Internet, but resides on hardware that I own and control. In the popular cloud computing concept, it's on hardware that I have zero control over, and on which other folks' data also resides. This makes it an immediate target for data miners who want my data. Nothing technological is impermeable, not even my setup. But with my setup, there is less likelihood of an attack because I am only one person. This is the same reason why Mac OS X, while proven to be less secure technologically than Windows, is still a safe alternative to Windows. There are fewer people to attack. A successful general attack for Windows will yield a higher success rate than the same type of attack for Mac. If, say, Google were to have a cloud computing service with millions of users, that's a better place to attack than my single-user setup on my personal hardware. Someone would practically need a personal vendetta to even bother with mine.</li>
<li>Even with broadband access, cloud computing is still slow. X forwarding is fast enough to use, but not fast enough to be my ideal method of using a computer. I use it because it's convenient, because I can manage a web server from my cell phone or from work. Because I can write documents or stream music or move my files around on my home computer even when I'm not there. But do I dare try to browse the web using a remotely operating browser? Do I try to do any amount of image editing remotely? Do I game? Of course not! Because even though my server has a 2 Mbps upload speed (roughly 250 KBps), that's still not fast enough to transfer that much visual data back and forth at a constant rate, much less all of my input. Cloud computing will take a very long time to be able to catch up to that.</li></ul>
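To put some rough numbers on that last point - the screen size below is my own assumption for illustration, not anything measured from my setup:

```shell
# One uncompressed frame of a modest 1280x800 desktop at 24 bits per
# pixel, versus a 2 Mbps upstream link:
awk 'BEGIN {
  frame = 1280 * 800 * 3            # bytes per raw frame
  link  = 2 * 1000 * 1000 / 8       # 2 Mbps in bytes per second
  printf "frame: %.1f MB, link: %d KB/s, fps sustainable: %.2f\n",
         frame / 1e6, link / 1000, link / frame
}'
# prints: frame: 3.1 MB, link: 250 KB/s, fps sustainable: 0.08
```

Even granting that the X protocol and compression buy back a couple orders of magnitude over raw frames, there's clearly no headroom left for gaming or video on a link like that.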
We have a long way to go before cloud computing becomes a viable option. I'd rather just stick to a configuration where it's a possibility for me when I need it, where my data is as secure as I make it, where I have total control over it, and where I do not try to exceed its limitations. I can't let my computer usage slow me down. When I'm thinking faster than my computer is, isn't my computer made unnecessary?Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com2tag:blogger.com,1999:blog-8335314075484393906.post-72991900493721412212009-03-27T22:57:00.000-05:002009-07-09T22:58:48.318-05:00Cloud Computing - IT'S NOT NEW!Cloud Computing - It's the new buzzword that profiteering hopefuls like Google and Microsoft have been using lately to sell products and services on the Internet that they have not fully developed. Cloud computing is the concept that your data will be kept on some server or set thereof somewhere in the "cloud." The "cloud" is meant to mean "somewhere in the Internet where you have restricted access, but can use certain portions of it on your personal computer, provided you can supply proper credentials."
An example of this that is already implemented: <a href="http://docs.google.com" target="_blank">Google Docs</a>. Also: <a href="http://www.zoho.com" target="_blank">Zoho</a>. These are web-based pieces of software that let you work on various types of documents that you might ordinarily use MS Office or OpenOffice to work on. Your data is stored on their server, and when you log in, you can access the stuff that belongs to you. It can be edited, printed, exported locally, or imported from a local file.
This is pretty cool, and I've used both of these services before. Still, I have my reservations about agreeing to malleable terms of service and then storing whatever personal information online. It just seems like a really good opportunity for data miners. Nothing is uncrackable. You need only look as far as a <a href="http://www.mckeay.net/" target="_blank">decent computer security blog</a> to see that what we perceive as impenetrable is really completely insecure when under the influence of the right people.
The other thing that bothers me is that, like "Web 2.0," the terminology has become a marketing buzzword meant to indicate something like "the way of the future." It's intended to arouse and excite. I have no doubt that some people will jump onto this concept like nothing else once it becomes more popular, simply because the marketing hands them a definition for "cloud computing" that they never had to begin with. And the truth of the matter is that cloud computing is not a new idea. It's been around for years. Offhand, I couldn't give you a number of years, but it's been in use for longer than I've been using it.
That's right. I've been using cloud computing for some time now, and it's a pretty cool thing. A friend of mine turned me on to a little thing called X forwarding. Perhaps this terminology requires some explanation.
I run Linux at home as my primary operating system. My particular distro uses the X Window System, which is pretty common. X is what allows windows to appear on my desktop in various forms, in accordance with the description provided by whatever software is using those windows. X is built on a client/server model: the X server manages the display, keyboard, and mouse, and every program is a client that tells the server what to draw. The clever part is that the client and the server don't have to be on the same machine. A program running on one computer can send its drawing instructions to any other X server that will accept its credentials.
What this means is that I can be on my laptop, or on the computer at work, or over at a friend's house, or at the library (assuming I can install some basic software on it), and run a free and open source X server - on Windows, that's a little program called Xming (Linux distros and Mac OS X have X servers of their own). Then I establish a connection between that computer and my server through SSH (easily done with another FOSS app called PuTTY), and then I type the name of the program that I want to bring across the Internet. Voila! I am running a program on a remote processor, accessing my remote drives, and working with my own remote data. I have cloud computing.
The best thing? All FOSS. Doesn't cost me a thing. Microsoft is talking about introducing a paid-for service like this called Azure, and while I haven't done extensive research into it, it seems plausible that you'll be locked into their software when using it. That is to say, the whole reason "cloud computing" has become a marketing buzzword is that it really does have profit potential. If Microsoft or Google can get you paying for their specialized "cloud" service, then you get stuck using their software, fearing the transition to another format or service.
I'm glad I already operate more or less on the fringe of this stuff.
Oh, and I know what most people will think when they read this. It's too difficult for Joe the Plumber to use. Well, it might currently be too difficult for Joe to set up for the first time, but if someone were to set this up properly for Joe to begin with (for a small fee, I'm sure), he could <b>use</b> it just fine.Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com1tag:blogger.com,1999:blog-8335314075484393906.post-791852853358757352009-02-03T22:59:00.002-06:002009-07-09T23:09:39.630-05:00The Mojave ExperimentI'm pissed. And not just kinda. Like, really really pissed.
Microsoft has recently put forth a series of advertisements in promotion of Windows Vista, their latest stodgy, crippled, overpriced operating system, in which they trick innocent people (possibly actors) into thinking that the operating system they're using on a demo computer is the "brand new Windows operating system, Windows Mojave." They've set up a website as well (MojaveExperiment.com) to spread the word. Then comes the big reveal where they tell their victims that it's not really this "Mojave" thing, but Windows Vista. It's all very carefully planned and misleading. Some evidence of that follows. For the record, I had to view several videos and actually work to gather all of the relevant information; not all of it is present at any one point, and I imagine this is all part of their need to trick folks into trying the operating system.
One video on the site says that the laptops they're using for the Mojave tests are "brand new, straight out of the box and into the hands of the users." Another video says that the laptops are actually these two Microsoft employees' work computers, and are at least a year old. A third agrees that they're a year old, but claims that the laptops are really the employees' personal computers. It seems the only consistency here is inconsistency. And don't tell me that two computer geeks who have owned a computer running Vista for over a year and use it for work haven't modified the software in all that time. Assuming it came preinstalled with Vista (and it did), I should hope that they've at least installed Service Pack 1 since then. That's miles better than it used to be. Also, I see a strange lack of desktop icons that should be there on a brand new, bloated as hell, crammed to the brim with advertisements, new HP computer. Also the desktop background's different. And I'm sure they've installed MS Office and a few other programs since Vista comes with absolutely no software that anybody could use to do any kind of job, especially one within Microsoft. So there's no way you're going to get me to believe that these computers have not had any customization over the past full year. Hell, you can barely convince me that they've been running for a full year without needing an OS reinstall.
They also say that they're not "special" laptops, just HP dv2000 series models running Intel Core 2 Duos at 2.2 GHz with 2 GB of RAM and an Nvidia GeForce 8400 video processor. This laptop (when new, and there aren't any more new since they are, in fact, over a year old) cost about $1700. That's a pretty special laptop, especially considering that most affordable laptops still don't even have dual-core or 64-bit processors in them, and usually 1 GB of RAM or less. I would go as far as to say that the laptop they use in these videos is at least twice as powerful as your average affordable laptop. The video card in that puppy is one model number lower than what I have in my desktop to play some pretty high-end video games. It's not the best thing on the market for desktops, but it's pretty close to the best thing on the market for laptops. There's nothing that this laptop shouldn't be able to do. I have run web and file servers on less.
Another video talks about security. If we are to believe the video, Windows Defender makes Vista "60% less likely to be infected by a virus," which, in my experience, is untrue; in my experience, Windows Defender does nothing at all. The official Microsoft statistics for it show that 22 million pieces of spyware were detected by Defender during its trial run under Windows XP, and that 14 million of those were removed. That's where the 60% statistic comes from (it's actually 63.6%). However, these statistics cannot tell us how many pieces of spyware went undetected entirely. Also, we're told that you're 60% less likely to get infected, which is not the truth. What the numbers show is that you are exactly as likely as ever to get infected, but about 63% of what is caught will be removed. A 63% removal ratio is not a good ratio. Not good at all. If I have one piece of spyware on my computer, I want it gone. I don't want to rely on a roughly three-in-five chance of it actually being removed.
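The arithmetic behind that figure, straight from the trial-run numbers above:

```shell
# 14 million of 22 million detected spyware instances removed:
awk 'BEGIN { printf "removal ratio: %.1f%%\n", 100 * 14000000 / 22000000 }'
# prints: removal ratio: 63.6%
```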
Another demo shows us that programs can be run in a Compatibility Mode. This is a counter to users' complaints that Vista is not compatible with a great deal of software and drivers. However, we all remember XP's Compatibility Mode and how it never worked. In my personal experience, the one in Vista is no better. I am at a loss to find anything other than how-to articles related to the subject, so I have no third-party opinion to share with you on the subject of Vista's Compatibility Mode. The demo in the video shows the experimenter running what appears to be a Bluetooth application in Compatibility Mode, though we are never shown that it didn't run in Normal Mode first. We have no proof from Microsoft that this is a valid test.
The "organization" video comes with a notice from the gentleman on the right-hand side that these computers are "definitely, definitely not top-of-the-line." Then the guy on the left lists the amazing specs of the computer. Then the guy on the right reiterates that it's "definitely not top-of-the-line." They tell you that this computer can be purchased for $650 to $700. This, of course, is not a lie. You can buy that computer at that price because that computer is now a year old and has no warranty left on it. A new one, as I have said, will run you well over $1000. This video emphasizes the Start menu search function, which I'll admit is a pretty cool feature. Too bad they stole it straight off Mac, but that's not the point. The point is they've actually implemented a cool feature. Not to belittle it or anything, but their method of demonstrating it here is somewhat flawed. It works like this --
<blockquote>LEFT: How would you start the calculator under Windows XP?
RIGHT: (fumbling for words) I'd go to Start -> Programs -> Accessories -> Calculator -- It takes too long.
LEFT: Well, check this out.</blockquote><i>Left clicks on the Start menu, types "calc" and up comes the calculator at the top of the Start Menu.</i>
<blockquote>LEFT: See? You don't even have to type the whole thing!</blockquote>The trouble with this exchange is that the filename for the calculator application is, in fact, calc.exe. So when you type "calc" into the start bar, it's finding calc.exe, not necessarily the term "Calculator." The same effect can be achieved by clicking Start -> Run and typing "calc" in Windows 95, 98, 2000, NT, XP, and even Vista. Though this demonstration is flawed, I will still admit to the usefulness of the Start menu search function. It really does work pretty well, but only if you have the Windows File Indexing service on all the time, which can be taxing on the proc and memory of your computer. It's something I ordinarily turn off because, well, I know where I keep my files. I have organization and don't rely on my computer for such.
There are other new features involved, and you can watch all these demos at the website. There are some really cool things that Vista can do, but the long and short of it is that it's way too heavy on your hardware to be considered a useful operating system. For instance, when Vista creates thumbnail images for pictures on your hard drive or thumb drive or whatever storage medium you've chosen, it loads the full-size image into memory, shrinks a copy for display in the Explorer window, and keeps the full-size original in memory as well. This created a problem for me when I was looking through photos taken with a professional camera. Each image occupied 15-20 MB depending on the color range in the picture. Instead of taking the images one at a time, shrinking each, and keeping only the shrunken version in memory, it tried to hold several hundred 15-20 MB images in memory at once. 1 GB of memory couldn't hold it all, so it fell back on the page file. I had to force-reboot the PC to get out of the function, thanks to the interconnected design of the OS, and double my computer's memory just to be able to browse.
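A rough sketch of why the memory ran out - the photo count here is my estimate from that session, not an exact figure:

```shell
# A few hundred full-size images held in RAM at once, versus 1 GB:
awk 'BEGIN {
  n = 300; mb = 18                  # ~300 photos at 15-20 MB apiece
  printf "%d MB demanded vs 1024 MB of RAM (%.1fx over)\n", n * mb, n * mb / 1024
}'
# prints: 5400 MB demanded vs 1024 MB of RAM (5.3x over)
```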
My point is that you should go ahead and try Vista if you want, but for God's sake don't pay for it first. Microsoft is running an extremely dishonest advertisement to overcome a lot of their software's completely valid detractions. They're not fixing much because to do so would involve writing an entirely new OS, and they'll be damned if they'll do that and not ask for another $200. It's bad enough that they're asking for that right now for an OS that is inferior in many ways to a great deal of free-of-charge operating systems. Just see my link list at the top-right for more information on this type of stuff.Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com1tag:blogger.com,1999:blog-8335314075484393906.post-89233104452844479272009-01-16T23:10:00.001-06:002009-07-09T23:40:03.172-05:00CompetitionFernando and I discussed operating system market share the other day, and I mentioned something mentioned in <a href="http://linux.oneandoneis2.org/LNW.htm" target="_blank">this article</a>: that Linux users don't care if the Linux operating system, (speaking collectively of all distributions thereof) becomes widespread. To quote Dominic Humphreys, the author of the article:
<blockquote>Linux is not interested in market share. Linux does not have customers. Linux does not have shareholders, or a responsibility to the bottom line. Linux was not created to make money. Linux does not have the goal of being the most popular and widespread OS on the planet.
All the Linux community wants is to create a really good, fully-featured, free operating system. If that results in Linux becoming a hugely popular OS, then that's great. If that results in Linux having the most intuitive, user-friendly interface ever created, then that's great. If that results in Linux becoming the basis of a multi-billion dollar industry, then that's great.
It's great, but it's not the point. The point is to make Linux the best OS that the community is capable of making. Not for other people: For itself. The oh-so-common threats of "Linux will never take over the desktop unless it does such-and-such" are simply irrelevant: The Linux community isn't trying to take over the desktop. They really don't care if it gets good enough to make it onto your desktop, so long as it stays good enough to remain on theirs. The highly-vocal MS-haters, pro-Linux zealots, and money-making FOSS purveyors might be loud, but they're still minorities.</blockquote>
There's some truth in what he says, but Nando, like always, made me see it from a different perspective. Truth be told, ask any one of us Linux users why we use Linux instead of Windows, and we'll get all high and mighty. What? Me use Windows? Use a stodgy piece of software with almost no security where I have to pay hundreds of dollars every few years in order to keep my computer from being a useless pile of garbage and end up buying a new computer every time I do just because Microsoft's operating system can't run on medium-end hardware? Pay oodles of money for expensive software that matches no standard and for which I have to pay extra money for basic support? No way! I'm tired of dealing with it!
But it's this last sentiment that undercuts Mr. Humphreys' statements. I'm not trying to belittle the article or call it wrong or anything. In fact, this article is usually the first thing I give to somebody who shows any interest in Linux because it's a great preparation for someone about to make the switch. It's just that we like to complain about Microsoft and how they're all over the place and how those of us who are the official family and neighborhood computer technicians end up supporting broken software constantly.
The hypocrisy here is that we'll complain about the forced ubiquity of Windows (and enough's been said about the Microsoft tax, so I won't go into <i>that</i> rant here), and we'll make the argument that Linux is the ultimate fix for Windows, but (at least in Dominic Humphreys' world) we supposedly don't care about Linux's market share.
If we do not care about market share, then we are content to sit in the background screaming about the Linux solution and doing nothing to enact it while Microsoft continues to write shitty software and install it across the globe, laughing all the way to the bank. And the reason they're laughing?
Because they didn't even have to try.
Competition breeds innovation. The closest thing to competition that Microsoft has is Apple, but Apple does not count as true competition. Apple does not sell a run-anywhere-on-anything, just-add-drivers operating system. They sell pretty-looking hardware that happens to have an operating system on it. And if that operating system happens to be so easy to use that my grandma can use it, then let Apple sell five-hundred-dollar hardware to my grandma for two thousand dollars. But Mac OS X won't run on just anything, only the hardware it comes on. Windows will run on a lot of different hardware, and is therefore a fundamentally different product, a product much like Linux. Linux is perhaps the only hope of creating competition with Microsoft, and in many ways, Linux has the advantage.
Linux is usually free of cost. It doesn't have to be, not by its own philosophy, but you certainly don't have to look far to find a free version. If it's something for your average Joe to use, you really only need to look at something like <a href="http://www.ubuntu.com" target="_blank">Ubuntu</a> or one of its derivatives like <a href="http://www.linuxmint.com" target="_blank">Linux Mint</a>. So it's not just a matter of how much cheaper it is than Windows, it's a matter of it not costing anything at all.
Basic support is free. Forums are all over the Internet. IRC chat rooms are available. It's a community effort and within the community is where you'll find the best help. You're getting responses from the same people who wrote the software in the first place, not some schmoe in India whose name is most definitely not Steve, plus you're not getting charged a single penny for any of it.
Desktop effects, security, and tons of free software make it an appealing choice. Every time I show somebody my desktop cube or my rain of fire when windows close or my ability to remotely control my computer in extremely granular detail, they are blown away. When I show them that my laptop can hop onto the closest wireless network in two clicks instead of countless menus, from a single drop-down box in my tray instead of several enormous windows, they say, "I wish it were that easy on my computer." My computer also doesn't slow down over time, fragment files, become unstable, crash, catch viruses or spyware, and I never have to perform regular maintenance to be able to make these claims. People like that. They like that they don't have to work for their computer.
So we should care about market share, because if we do, we can force Microsoft's hand. I would be stunned, but not offended, if Microsoft produced a quality operating system, or a browser that doesn't fail miserably when trying to conform to web standards, or a mail client that uses less than 1.5 gigs of hard drive space to store 137 megs of email. In fact, I would be thrilled were that the case. But Microsoft has no real competition, so they have no real reason to do these things. If Microsoft lost significant profit due to the disintegration of their market share, that would mean that enough people are using Linux to make any kind of FUD marketing Microsoft could do useless, and they would be forced into writing better software. I might be tempted to use a Microsoft operating system primarily if it could provide a significant benefit over my current Linux configuration. I'm one of many customers Microsoft could attempt to win back.
Linux also shows potential. It's been around since 1991, which is pretty impressive when you consider how small its user base is, and that means that its user base is strong and emphatic. The market share for desktop versions of Linux (non-server) has doubled in the past year, and that means the market share is growing. Granted, it's grown from about one and a half percent to about three percent, but it's expected to grow to nearly ten percent soon. This is largely because of Linux's ability to run quickly on low-end hardware. Linux has become a popular choice for those "netbook" things, which are insanely popular. Already, Microsoft has been forced to react to this by making the upcoming Windows 7 run on less hardware. Linux still has the advantage here, though, because the customer buying the netbook (typically designed to be inexpensive) isn't having to pay for the operating system. I'd say that Windows has the advantage of popular software, except that they don't. My suspicion is that Windows 7 will have just as many operational problems and software backward compatibility problems as Vista had, while Linux is and will remain less vulnerable to those.
Contrary to popular Linux belief, Linux users do need to help popularize the operating system if they care about the future of technology. Like I said before, competition begets innovation, and Microsoft is apparently already feeling the heat.Anonymoushttp://www.blogger.com/profile/02957471586871393621noreply@blogger.com0