Saturday, December 22, 2007

Starting a Technology Podcast

My friend Brian Westfall and I have technology discussions pretty regularly. We talk about computers, console games, and home theater, among other things. And we talk a lot. Sometimes the discussions actually get pretty deep, too.

So recently we decided that we were going to turn some of our conversations into podcasts that others can download and listen to. The result is called "Tech Squawk."

We hurried to record the first episode before Christmas, hoping that some of the stuff we recorded might be of value to someone. It's just a few days before Christmas now, but maybe somebody will find our podcast interesting.

The official web site for the podcast is For anyone using an RSS aggregator, the feed address is

Like I mentioned, the first episode was rushed, just to get something out there in time for the holidays. That also means it wasn't edited down to a more concise size, so plan on spending about 90 minutes if you want to listen to the whole thing.

We'll get to our normal weekly schedule after the first of the year. And future episodes will be much shorter. Hopefully less than 30 minutes, because longer podcasts just get a little dull.

Enjoy! And thanks in advance for any feedback!

Monday, December 10, 2007

HD-DVD vs Blu-Ray

I was researching HD-DVD and Blu-ray for an online posting, and some of the information I found is pretty interesting. Here's some of what I found, based on information from and Wikipedia.

HD-DVD titles are nearly all released in the format's higher-capacity 30GB version. As of this writing, 51.97% of Blu-ray discs are released in the lower-capacity 25GB BD25 format. Thus the apparent size advantage of Blu-ray isn't currently being utilized by approximately half of the currently published Blu-ray discs. The HD-DVD specification was recently updated to support 17GB per layer, up to three layers (or 51GB) per disc. Blu-ray officially supports one or two data layers, up to 50GB total.

38.11% of Blu-ray discs are released in the MPEG-2 format. This is the older codec that has been blamed for poor video quality on early Blu-ray discs. While 50GB BD50 discs with MPEG-2 are certainly better than 25GB BD25 discs, AVC and VC-1 encoded discs offer much better image quality, even when comparing a 25GB AVC or VC-1 disc to a 50GB MPEG-2 disc.
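A rough back-of-the-envelope calculation shows why disc capacity matters so much for MPEG-2, which needs far more bitrate than AVC or VC-1 for comparable quality. The 25% overhead figure below (for audio, menus, and extras) is my own illustrative assumption, not a spec number:

```python
# Average video bitrate available for a movie of a given length on a
# given disc, after reserving part of the capacity for audio and extras.
# All figures are illustrative assumptions, not official spec numbers.

def avg_video_bitrate_mbps(disc_gb, movie_minutes, overhead_fraction=0.25):
    """Average video bitrate in Mbit/s after reserving overhead_fraction
    of the disc for audio, menus, and bonus material."""
    usable_bits = disc_gb * 8e9 * (1 - overhead_fraction)
    return usable_bits / (movie_minutes * 60) / 1e6

# A 2-hour movie on a 25GB BD25 vs a 50GB BD50:
bd25 = avg_video_bitrate_mbps(25, 120)  # roughly 21 Mbit/s
bd50 = avg_video_bitrate_mbps(50, 120)  # roughly 42 Mbit/s
```

Halving the disc capacity halves the average bitrate available, which is painful for an older, less efficient codec like MPEG-2.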

Before buying a Blu-Ray disc, check the site to make sure it isn't a 25GB MPEG-2 disc, as these are the ones with image quality problems. A 50GB MPEG-2 disc will look fine for movies of average (or shorter) length as long as there isn't much bonus material on the disc. I am not aware of any general image quality issues with HD-DVD discs.

Other format differences: HD-DVD supports Managed Copy which allows you to copy your movies to a home theater PC, though I'm not aware of any implementations of this just yet. Blu-ray does not have any such capability.

HD-DVD has more interactivity features than the BD 1.0 specification, though BD 1.1 attempts to address this. In the real world, this means that HD-DVD discs currently tend to offer more visually appealing menus and more disc features. This is likely to change slowly with the adoption of the BD 1.1 and 2.0 specification.

Both formats support the same video codecs. Audio capabilities are effectively the same, with some variation in which competing formats are used at different levels of compression. Both support uncompressed audio in at least 7.1 channels.

Both formats support full 1080p resolution at regular TV refresh rates as well as 24p, and effectively all movies are released that way. The implementation on the discs differs slightly, but the data is the same. Some HD-DVD players only output up to 1080i, but many 1080p TVs can reconstruct the original 1080p signal from that output for display. To take advantage of a smooth 24p-based cadence, players in either format must be connected to a 72 or 120Hz television via HDMI. Connecting to any other type of television, or using any other type of connection, will result in 3:2 pulldown being added to output video at 60Hz.
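For anyone curious what 3:2 pulldown actually does, here's a minimal sketch (2:3 is the more precise name for the cadence): 24 film frames per second are mapped to 60 video fields per second by alternately holding each frame for 2 and 3 fields.

```python
# 3:2 pulldown sketch: frames alternately span 2 and 3 video fields,
# so 4 film frames fill 10 fields, and 24 frames fill 60 fields
# (i.e. one second of 24p film becomes one second of 60i video).

def pulldown_fields(frame_count):
    """Field counts assigned to each film frame under a 2:3 cadence."""
    return [2 if i % 2 == 0 else 3 for i in range(frame_count)]

one_cycle = pulldown_fields(4)          # [2, 3, 2, 3]
one_second = sum(pulldown_fields(24))   # 60 fields
```

The uneven 2-3-2-3 pattern is what causes the slight motion judder that a true 24p display chain avoids.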

Movie studio support for the two formats is pretty much a toss-up, with current offerings in both camps having almost exactly equal numbers of titles available.

HD-DVD does not have any region-coding requirement, so you are always free to import discs from overseas and play them on any player. Blu-ray uses three region codes (A, B, C) to ensure that region-coded discs are not played outside their intended regions, similar to the way DVDs are region-coded now. The lack of region coding can in some cases be a disadvantage for HD-DVD: a disc release is sometimes delayed for a short time while the movie continues to show in theaters elsewhere in the world.

In several cases, if you are not able to obtain a movie title in your format of choice in your home country, it might be available on the other format elsewhere. If you have elected to use Blu-ray as your format of choice, however, you need to make sure that any disc you purchase from overseas will play in your region.

Some/many HD-DVD discs are available in a "combo" format (usually at a higher cost) which contains the high definition version of a disc on one side, with a standard DVD version available on the opposite side for playback in regular DVD equipment. Blu-ray does not offer a similar capability.

Disc prices are pretty similar between the two formats, with both being significantly more expensive than DVDs.

Saturday, December 1, 2007

Making a Webcast (Creating my Photography Class)

Anyone following my life in the last couple months knows that I recently taught a 5-episode class on photography techniques, and some of you even watched along as I broadcast it across the internet. What a lot of you probably don't know is what it takes to make a webcast happen. So I felt it appropriate to share a little bit of the "behind the scenes" stuff that goes on.

Over the last couple of years I have had a number of friends approach me and ask me if I would be willing to teach them how to take pictures. Of course I'm happy to do so, but in most cases we never got around to doing anything about it, and when we did, I didn't have my thoughts very well organized and honestly I don't think I was really that much of a help. But earlier this year I thought about organizing and teaching a formal class.

It all started by planning out what material I wanted to cover in the class and organizing it into 7 general topics that would each fit in about an hour's time. Since I had been thinking about teaching a photography class for months before even announcing it, and most of my thoughts were already fairly well organized as to what I was going to teach, I was well on my way. But I needed to categorize it all and present it in an order that would make sense. A lot of time was spent deciding just how much background I would give on a specific point of information, and it turns out I ended up researching, categorizing, and creating graphics for about five times as much information as I would be presenting in class. But that ended up working great, because when a student would ask a question, in almost all cases it touched on a topic I had experience with, or was right in line with something I had researched beforehand. If someone asked me, for example, "how does depth of field work mathematically?" or "how do you define 'in focus'?" I was actually prepared with an answer, complete with graphics to illustrate.

Once the basic outline of the class was set, I had to build a web site to present information about the class and allow students to log in to update their interests and schedule. My brother Brent actually did the vast majority of the programming for it; the biggest chore I had was to design the look of the site and integrate Brent's code into it to make it work. There was about a full day of labor there. But what nobody ever saw was a custom-written piece of software that I put together to maintain the database on that site. Originally when planning the class I was going to hold each class twice in a week so that the most interested people could attend, and the timing of those classes was going to be decided based on when interested students could possibly attend. The software I came up with was designed to allow students to input their schedules for the next week, and it would then calculate the best days and times for between one and three sessions of each class based on who was interested in a given topic and their schedules. I could, with a single mouse click, see when I should hold a class to get the most interested people there, who could attend, and who could not. It was actually pretty slick, even though it was only ever used to decide that Thursday at 8 was the best overall time to hold a single class.
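As a rough illustration of the scheduling idea (the names, data structures, and sample numbers here are my own, not the original software's), the core of it is just scoring every candidate time slot by the total weighted interest of the students who could attend and picking the best one:

```python
# Hypothetical sketch of the class-scheduling idea: each student reports
# the time slots they can attend, each has an interest weight, and the
# winning slot is the one reaching the most interested students.

def best_slot(availability, interest):
    """availability: {student: set of (day, hour) slots they can attend}
    interest: {student: interest weight, e.g. 0-10}
    Returns (slot, score) for the highest-scoring slot."""
    scores = {}
    for student, slots in availability.items():
        for slot in slots:
            scores[slot] = scores.get(slot, 0) + interest.get(student, 0)
    return max(scores.items(), key=lambda kv: kv[1])

availability = {
    "Ann":  {("Thu", 20), ("Sat", 10)},
    "Ben":  {("Thu", 20)},
    "Cara": {("Sat", 10), ("Thu", 20)},
}
interest = {"Ann": 8, "Ben": 5, "Cara": 9}
slot, score = best_slot(availability, interest)  # ("Thu", 20), score 22
```

Extending this to pick the best one-to-three non-overlapping slots per topic is a small step from the same scoring table.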

Creating the opening graphical video sequence was another task that had to be performed ahead of time. By the time I actually sat down to create it I had a color scheme in mind, so choosing the appropriate background graphic wasn't that hard. The most time consuming part was finding the appropriate pictures and adding them to the timeline with just enough motion and effects to make things interesting. Music was selected, and a DVD made containing the full opening/closing graphic sequence in two versions: both with and without music after the opening (so I could have music playing under my voice as I was introducing the class). Total time creating the opening sequence was about 10 hours.

Next came building a set. My regular living room isn't very camera friendly, and certainly wasn't laid out in a very classroom-friendly configuration, so I had to actually rotate the room 90 degrees from normal and put up fake walls using photo backdrops (called "muslins") for the wall behind me, and to create a fake wall behind the projection screen to hide the home theater system behind it (incidentally, I still haven't taken them down; they are much more attractive than the wallpaper behind them). I also had to build the "desk" that I used, which, if you were to see it in person, is very oddly shaped and wouldn't make any sense in any other situation. Building and painting the desk took about a day; putting up the fake walls and lighting the room took another half a day.

Next came the video cameras (all 5 of them), two of which had to be mounted to a wall because there was no room for tripods. Running cables for the four microphones (3 on the ceiling for the three couches and my wireless lavaliere mic) and 15 video sources (yes, there really were 15 video sources) to my "studio" followed, but ironically the most challenging part of setting up the set was getting video from the laptop computer I was using to my monitor, the projector, and out to the web simultaneously. I needed to be able to see my computer desktop but present the informational slides simultaneously on three different types of display devices (projector, LCD and CRT monitors, and output to the web). I won't go into any more detail, but it was actually the biggest technological challenge I had setting up for the class.

Video from the two main cameras was shot by one or two operators who generously donated their time (David Skousen and Paul Green, thank you!), and sent down the hall to the studio where former roommate and great friend Brad Riching switched between the different sources using a proprietary video switching solution (more about that in a minute). Brad also ran the audio mixer to combine just the right amount of sound from each of the four microphones (and, when necessary, the music from the DVD), and the audio, along with the video, was sent to processing equipment to make it look and sound better for DVD and web broadcast. In short, each microphone is run through multiple processors: one to make sure levels are both loud enough to hear and not so loud that they distort, another to keep speech intelligible, and after all of that the final combined signal is processed yet again in the same manner before being sent to the recorder and out to the web. Video is handled in a similar way. (On a side note, if all of the audio and video processing equipment used was stacked top to bottom, it would be about 10 feet high.) The processed audio and video are distributed to a hard drive-based video recorder, a computer to encode it for web distribution, and three on-set video monitors simultaneously. That way I (and my class) could see exactly what was going out to the internet.

The encoded video was sent to an in-house server, and was pulled from that server to another that has a direct internet connection, provided by my great friend Brian Westfall. When the webcast is watched from home, it was actually Brian's server that you were connecting to because my internet connection at home isn't anywhere near fast enough to handle more than a couple people watching.

For each class I began researching topics specific to that class about a week beforehand. On average I'd spend about 8-12 hours reading information online from different photography web sites and in magazines to get a better feel for other photographers' techniques, so the material being presented wasn't from just my own experience. On the day of (and sometimes the day before) a class, I would spend the entire day searching for sample images, taking pictures, and building the slides that would be shown in the next class. I also would spend about 3-4 hours building an outline for me to follow while teaching. In the first couple of episodes you'll see my outline on the desk (incidentally printed on blue paper rather than white to avoid blowing out the video camera exposure), but by the third I began using a new "notes" feature in my slide presentation software (again, more on that in a minute). Originally I had a video prompter set up so I could read my notes without having to look down at my desk, but I found I wasn't using it, so it was abandoned after the second class.

Another helpful piece of technology was the IFB (In-ear FeedBack earphone monitor) I was wearing in my ear as I was teaching. While the class was going on I could hear the introductory music, myself, the classroom microphones, and occasionally Brad would speak up and remind me of something or give me helpful hints. On numerous occasions I would forget to mention something, or misspeak, and Brad would come on and tell me without anyone else even knowing that was going on. Several bits of misinformation and cases of missing information were avoided because of that one small thing.

Okay, so on to the proprietary technology. As it turns out, other than Windows and the photo imaging software I covered in the last class, every piece of software used to make the class was something I had written at one time or another. The slide presentation software is something I started working on about 3 years ago and it has grown quite a bit since that time. I don't know if anyone noticed, but the slides are of much higher quality than those made by PowerPoint, and I could create slides in real time. As far as I know there isn't any other software out there that is capable of doing that, and it certainly made the class run more smoothly since, as students would ask questions, I could pull up images or slides to illustrate my answer even if they weren't part of my original presentation. The only problem I ran into with the software was that for some reason it was taking next to forever to resize images for television display (as was seen in the first 3 classes) on the laptop that I was using to run the slideshow. Fortunately I was able to fix the problem and images came up immediately for classes 4 & 5.

Next was software for running the video switcher. Conventional video switchers use an array of video monitors, one per video source, with rows of buttons (multiple per video source) to select which source is sent to a "preview" and the outgoing "program." Not only is this an ultra-expensive way to do things, requiring a ton of equipment, it isn't the most ergonomic way either, because you have to look at video monitors directly in front of you and match them with a row of buttons away from your line of sight. So what I came up with was a touch-screen based video switching solution. Brad sat in front of a touch screen monitor which showed 16 video windows simultaneously, and simply had to touch one of them to select it to go out to the program feed. Of course it also allows selection of the type of transition between video sources (in the case of this class we used cuts between camera shots and dissolves between graphics). Having such a simple interface is what made it possible for Brad to run both the video switcher and audio mixer at the same time; otherwise, running a video switcher is an all-consuming task. Again, I'm not aware of anyone doing anything like this anywhere, so a fully custom piece of software had to be written. As the class went on this software evolved to the point where the DVD players with the graphics could be controlled right from the touch screen interface, and I even added the capability of automatically selecting the slideshow video source as I brought up the slides. Very cool stuff, if I do say so myself.
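A minimal sketch of the touch-to-source mapping such an interface might use (the screen resolution and grid layout here are assumptions for illustration, not details of the actual software): the screen is divided into a 4x4 grid of preview windows, and a tap selects whichever source sits under the finger.

```python
# Map a touch coordinate on a 4x4 grid of preview windows to a video
# source index (row-major, 0-15). Screen size is an assumed value.

GRID_COLS, GRID_ROWS = 4, 4      # 16 preview windows
SCREEN_W, SCREEN_H = 1280, 1024  # assumed touch-screen resolution

def source_at(x, y):
    """Return the source index (0-15) under the touch point (x, y)."""
    col = min(x * GRID_COLS // SCREEN_W, GRID_COLS - 1)
    row = min(y * GRID_ROWS // SCREEN_H, GRID_ROWS - 1)
    return row * GRID_COLS + col

# A tap in the top-left window selects source 0; bottom-right selects 15.
```

The appeal of this design is that looking and selecting happen in the same place, which is exactly why one operator could handle both switching and audio mixing.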

There were other smaller pieces of software that I wrote to make the class work. Of course the color wheel software (download link) that I showed on camera was something that I threw together, and I mentioned earlier the software I created to keep track of students and their schedules. There were other little things, though, like a small program I had to create to display the "The program will begin in…" screen before the start of the webcast.

After each class was held, I had to capture it on the computer for editing. I haven't gone back and cleaned any of it up, mind you, but capture is done in real time, which gave me a chance to watch each class to critique my teaching and look for any holes in the content. After each was captured I added closing credits, encoded it into WMV format for upload to the internet (encoding each episode took about 6-8 hours) and I would then attempt to upload it to Google Video, a process that in and of itself took about 3 hours to complete.

So long story short, for each one hour episode of the class, there was about 3 days of work to create and present it. If I were to offer any advice to anyone thinking of creating a video web or podcast, I would say to do it, but only as long as you are enthusiastic enough to take on that kind of a load and have the time to do so. I had a ton of fun working on every aspect of it, and if I could guarantee that I'd get higher levels of participation from class members and continue to get such great help from my friends I would definitely do another series on something else. Who knows? Maybe one of these days I'll end up doing a class on just Photoshop, or writing software, or running a home recording studio, or making videos. Maybe even a webcast on how to create a webcast.

I'd love to hear your comments.

Insensitive Comments & Taking Offense

The first week I started going to a new ward (church congregation, for the LDS-uninitiated), a woman there made a comment to me that, at the time, I didn't think much of, but looking back, what she said could be taken by some as very insensitive and could be considered offensive. Her comment to me was "You need to get married!" (original emphasis), which is something I have heard before, but her tone of voice while she said it seemed to indicate contempt for my single status.

Like I said, I didn't think much of it at the time. And I'm not one to ever take offense at anything. But a while later, after thinking about it a little more, I began to realize just how insensitive a comment like this is. Not only was she implying that there is something wrong with me because I'm single, she was flat-out telling me that it's my fault because I haven't done anything about it. Obviously she knows nothing about me and certainly isn't qualified to offer any advice on my behalf, but what she fails to see is that it isn't (entirely) my fault that I'm not married. There is only so much that I can do about it; it literally takes two to tango. A far less offensive comment (and possibly a snappy, suitable reply) would be "you need to lose weight!" because, let's face it, that is (usually) something an individual can do something about themselves, without relying on participation from another individual. "You need to get married!" certainly doesn't help the situation, and no matter how hard I try, I can't fix it by myself.

I'm not here to rant about this woman's comment so much because we all say things sometimes that can be taken as offensive. But I do believe that we need to "think before we speak" a little bit more.

On the other side of the coin, I also believe that we in general take offense way too often. If you think about it, being offended by someone is a form of justification of hatred toward them (stop and think about that for a minute) and certainly doesn't get anybody anywhere good. What benefit do we have from taking offense? I certainly can't think of any; it has the opposite effect. We have enough problems in the world without adding to it by taking offense at comments made by others. So if someone says something offensive to you, the best thing is to just let it roll off your back and move on. Don't waste any time and energy thinking about it, or even worse, doing anything about it.

Originally this post was going to be a long discourse, but I think it just boils down to being careful what we say, and never taking offense at others' comments. Just don't do it.

Sunday, November 18, 2007

Zune, Zune, Zune!

Anyone who knows me knows that I love electronic gadgets. And anyone who knows me might be surprised to know that I don't own an iPod, and until now, I haven't had what I would consider a real MP3 player. Sure I bought a Toshiba Gigabeat a couple years ago, but it was too limited and ended up not being used aside from occasionally on a plane when the music I wanted to listen to wasn't on my phone, and it will soon be going up on eBay. But as of this week, I finally dived in and bought myself a music/video player… an 80GB Zune.

People have asked me over and over again why I don't have an iPod, and well, there are many reasons. The primary one is that I ripped my CD collection in WMA format before the iPod was even on anyone's radar, and Apple has chosen to disable WMA support in the iPod (the capability is actually there; they have just turned it off), which leaves the iPod virtually useless to me. I'm not about to re-rip 1100 CDs, and conversion to Apple's AAC format is out of the question due to the loss in quality in the process and the amount of time it would take. The other big issue is that the professional audio applications I use (Adobe Audition and Cakewalk's Sonar, for example) don't support AAC. If you're happy to live in the iTunes/iPod/iLife world, the iPod is probably fine for you. But I do far more with my music, and the whole Apple "i" world is way too limiting for me. (For example, the structure that iPods use to store music makes it nigh impossible to copy music back off of the device, a capability I need to have.) Long story short, I would never be happy with an iPod.

Every other device I have looked at has had deal-breaking limitations. The most common is storage. The portion of my music library that I would like to keep with me is about 60GB, and most manufacturers have chosen not to create a device with this capacity. And most of the ones that do are large (almost unwieldy) devices. But when the Zune 80 was announced, I was intrigued. I didn't like the original Zune at all; Microsoft typically doesn't do well with a 1st-generation product, but they typically are able to get it right on the second or third version. I suspected that they had probably fixed most of the problems with the original Zune in the new version. And indeed they have. The new device isn't perfect, but it is very well done. Microsoft got it right this time.

The new 80GB Zune is nearly identical in size to the 80GB iPod, making it small enough to not give you trouser bulges, but at the same time it is built with enough heft to feel like a robust product. The screen is beautiful: bright and colorful, and compared to the screen on the iPod Classic it is absolutely huge (64% larger, approaching the size of the screen on the iPod Touch)! The new "squircle" control is sensitive to both directional pad-style button pushes and to touch; you can swipe your finger over the squircle to scroll through music, adjust volume, etc., and it is very fast. Some may still prefer the click wheel of the iPod, but the touch interface of the Zune is as good; it's a matter of preference, not capability. The WiFi feature to "squirt" songs between Zunes, inherited from the original version, is still there with a few limited additions, but they have added a really neat "WiFi sync" feature that allows me to synchronize the Zune with my desktop computer without plugging it in to the USB cable. It even synchronizes each time I place the Zune in its (optional) dock, which isn't even connected to a computer (in my case it's just connected to an AC power source for charging). Very cool trick; there isn't really any reason to connect to a computer any more.

Navigation on the device is fast and easy, and is more flexible than the iPod. For example, after you select an artist, you can very quickly move to the next artist with a single right/left click of the squircle. You don't have to go "back" to the previous menu to choose another artist. Also, when selecting an artist, the Zune displays both the albums and tracks in a single list, with the albums listed first, making it easy to find a song if you don't know which album it is on. Of course you can drill down to individual albums, but you don't have to. Using these two features simultaneously effectively allows you to navigate through albums or tracks by artist with only a single button press required to change artists. This is very powerful and makes navigation very fast.

The included headphones are quite good for earbud-style 'phones, though they don't really compare to the Shure E3's that I usually use for listening to music. They are, in my opinion, better than the ones included with iPods; the sound is more full with more accurate bass and clearer high frequencies. They have an unusual characteristic, however, in that they must be inserted a little farther into the ear canal than other earbuds I have used in order to get the best sound. Fortunately they are comfortable when inserted properly.

The software has been completely rewritten from the ground up compared to the first Zune, which is definitely a great thing. The original software was basically a hacked version of Windows Media Player, but the new Zune software is a brand-new product, and it is very well done. Not only is it easy to navigate, it is very pretty to look at, complete with high quality animations while navigating. They have added Podcast support in this version (a major omission previously), a very welcome addition. It uses a three-column view for artists, albums, and songs, which gives some interesting navigation options. For example, clicking an artist shows you all of their albums in the center column, and songs in the right column, so you can very easily and quickly get to the music you are trying to find. But if you click on a blank area in the artist column, it goes back to showing all albums and tracks again. The search feature isn't especially fast, but it is effective, dividing search results into artists, albums, and tracks, eliminating the need for separate searches, or filling in multiple fields in a search screen. Marking music for synchronization is easy; just drag the album, artist, or track to the Zune device logo in the lower left corner of the window. Viewing, playing, and managing music already on the Zune is done on the "device" screen, and it again uses the same 3-column view. Very simple and very easy.

There are a few things about it that aren't obvious, though. If you play a video, the navigation interface disappears and the full window is used for video playback. This is fine, but after clicking the Exit button to get back to the navigation screen, it isn't immediately obvious how to get back to your video, even though it is still playing in the background. There is an equalizer-looking icon in the lower right corner that you click to restore the playback screen.

One thing that nobody has done right yet in device synchronization software is a simple one-click sync option. In my opinion, the best way to handle synchronization would be to place checkboxes next to each artist, album, and song, with a checked state indicating "yes, I want this on my portable device." The Zune software at least shows a small device logo next to anything that is on the player; it just doesn't allow this to be toggled on and off with a single click.

There are other things missing, too, and one or more of these may be a deal breaker for some people. The Zune Marketplace software does not, for example, have any video (TV shows or movies) available for download like iTunes does, and Audible does not currently support the Zune for its audio books. As far as I know there is no way to make a car stereo control a Zune.

A few final "plusses" before I go, though. The hard drive based Zunes (30, 80GB) do not require a special cable to connect to a television unlike the iPod Classic; any standard 1/8" A/V cable will work. The A/V dock comes with component video outputs for connection to a high definition television. Very cool.

Overall I think Microsoft has done a great job on the new Zune. Anyone who bought the original Zune would have had good reason to be a little sheepish when telling others of their audio player choice, but with V2 I think Zune owners can finally be proud of their purchase. (Fortunately for original Zune owners, their devices can be upgraded via firmware to incorporate the new features and use the new software, all for free!) Feature-wise it comes in somewhere between the iPod Classic and the Touch, but it is priced identically to the Classic (or the Nano, if you are talking about the 4/8GB Zunes).

Wednesday, October 31, 2007

Megapixel Myth

Something I have wanted to cover in my Photography Class but just haven't had time to is the myth of the megapixel when purchasing cameras. When people ask me about my cameras, the first question that inevitably comes up is "how many megapixels does it have?" Somehow camera manufacturers have tricked the general public into thinking that more megapixels in a camera equates to a better image. Unfortunately this is far from the truth.

Allow me to describe what is happening in a digital camera so I can explain why "more megapixels" is probably a bad thing. Digital cameras work by focusing an image onto an analog image sensor that converts light into electrical impulses (similar to the way the human eye works) in millions of tiny sites called pixels. These signals are processed by another chip and are converted into a digital image that is stored on a memory card. Unfortunately this process isn't perfect, and has a few problems. One of those problems is electrical noise in the image sensor itself, and that electrical noise shows up as random variations in color and brightness in individual pixels in your final picture.

In my first class we talked about ISO, which in a digital camera is effectively the sensitivity of the image sensor. One might think that more sensitive is better, and in theory this may be true in certain circumstances. However, to accomplish higher sensitivity, the camera amplifies the signal coming off of the sensor, and the result is the random noise I just mentioned. The amount of electrical noise in the chip is more or less constant no matter how much light comes into the camera, so by dialing up the sensitivity you get more noise and less signal from the light entering the camera. The ratio of actual signal to electrical noise decreases as you turn up the sensitivity: less signal against a constant amount of noise. High ISO settings result in noisier pictures.
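As a worked example of this simplified model (constant noise floor, signal halving with each ISO doubling; the numbers are illustrative, not measurements of any real camera):

```python
import math

NOISE = 1.0          # constant electrical noise floor (arbitrary units)
BASE_SIGNAL = 64.0   # signal captured at base ISO 100 (arbitrary units)

def snr_db(iso):
    """Signal-to-noise ratio in dB under the simplified model above."""
    signal = BASE_SIGNAL * 100.0 / iso  # half the light per ISO doubling
    return 20 * math.log10(signal / NOISE)

# Each doubling of ISO costs about 6 dB of SNR in this model:
# ISO 100 -> ~36 dB, ISO 200 -> ~30 dB, ..., ISO 1600 -> ~12 dB
```

In other words, every stop of extra sensitivity you dial in comes straight out of the signal-to-noise budget.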

Another way to increase the sensitivity of a chip is to make the pixels larger so they can capture more light. Since the electrical noise in the chip remains more or less constant, more light means a cleaner image. But larger pixels mean you either have to make the chip larger, or cut down on the number of pixels in a given area. The trend among camera manufacturers, however, is going the other direction… stuffing more pixels into the same size chip (larger chips are considerably more expensive to manufacture and require larger lenses, which are also very expensive to make). The result? A smaller surface to capture light, which means an increasing amount of noise in the pictures we get out of newer cameras. And higher levels of noise mean the camera has to work harder to remove that noise, and removing noise also removes real image detail; the camera can't distinguish between the two. So your final output ends up being a low-resolution noisy mess.
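
To put rough numbers on that trade-off, here is a quick sketch; the chip dimensions are hypothetical, chosen only to resemble a typical compact-camera sensor, not any specific product:

```python
# How pixel size shrinks when megapixels go up on the same size chip.
# Sensor dimensions below are hypothetical, for illustration only.

def pixel_area_um2(sensor_w_mm, sensor_h_mm, megapixels):
    """Approximate area of one pixel, in square micrometers."""
    sensor_area_um2 = (sensor_w_mm * 1000) * (sensor_h_mm * 1000)
    return sensor_area_um2 / (megapixels * 1e6)

# Same hypothetical compact-camera chip, two different pixel counts:
five_mp = pixel_area_um2(5.8, 4.3, 5)
ten_mp = pixel_area_um2(5.8, 4.3, 10)

print(f" 5 MP: {five_mp:.1f} square micrometers per pixel")
print(f"10 MP: {ten_mp:.1f} square micrometers per pixel")
# Each pixel on the 10 MP chip gathers exactly half the light of the
# 5 MP chip, so each pixel's signal sits closer to the noise floor.
```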

More megapixels would be fine if the sizes of the chips and lenses were increasing. But there is another disturbing trend here: the lenses and chips that cameras are equipped with are actually shrinking rather than growing, because the cameras themselves are getting smaller and smaller. Not a good thing. The level of noise has gotten so bad that the high ISO settings on most new cameras are basically unusable.

So in terms of raw specifications and the reality of what they mean, the newer digital cameras coming out now are actually inferior to their predecessors in more than one way. The Canon 40D that I just bought, for example, produces a visibly noisier image at high ISOs with its 10.1 megapixel sensor than the 8.2 megapixel 20D it replaces. (Fortunately, because of the large sensor used in both cameras, the noise is low enough, and at low ISOs completely invisible, that I don't mind trading a little extra noise for the new features. The same can't be said of compact point-and-shoot digital cameras, however.)

If megapixels aren't the best indicator of the ultimate quality of the image coming out of a camera, what is? Turns out, it's the size of the lens. I'm talking about the glass itself, not the barrel surrounding it. Larger lenses let in more light, and lenses are generally matched to the size of the sensor behind them; larger lens usually means larger sensor. So when comparing two cameras with generally equal specifications, the one with the larger lens is usually going to produce a much better image. (It shouldn't be any surprise, therefore, that cameras with nice big lenses also cost more.)
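
Since light gathering scales with the area of the glass, not its diameter, a modest increase in lens size buys a disproportionate amount of light. A quick sketch (the diameters here are arbitrary examples, not specs from any real lens):

```python
import math

# Light-gathering ability scales with the area of the lens opening,
# which grows with the *square* of its diameter.

def light_gain(d_small_mm, d_large_mm):
    """How many times more light the larger opening collects."""
    area = lambda d: math.pi * (d / 2) ** 2
    return area(d_large_mm) / area(d_small_mm)

# A 20 mm opening vs a 10 mm opening: only twice the diameter...
print(f"{light_gain(10, 20):.1f}x the light")  # 4.0x the light
```

Double the diameter, four times the light, which is why those big expensive lenses earn their keep.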

So when shopping for your next digital camera, don't be swayed by the "megapixel" number on the specification chart of the camera. If you fall for that trap you may end up with subpar pictures and end up paying more money for the privilege.

P.S. One of these days I might discuss how all camera manufacturers are lying to you about the number of megapixels anyway, overestimating by three times the actual number, but I'll save that for another day.

Monday, October 8, 2007

Photography Class!

If you haven't heard, I guess this is your official announcement that I am teaching a photography class. There are seven classes, one per week, and we started last Thursday (Oct 4th). To make it most worth my while I'm streaming the classes online and making them available via Google Video afterward.

Visit the official class web site:

The first class is now available on Google Video:

Hope to see you online or at the class!

Sunday, August 19, 2007

Buying a Laptop, Part Deux

As a follow-up to my previous post about buying a laptop, I finally settled on a Dell XPS M1210 12-inch super portable. It isn't quite an ultraportable, but it is small and light… right at 4 pounds. And it doesn't skimp on its specs either… Core 2 Duo T7200, 2GB of RAM, 120GB hard drive, GeForce Go 7400 video… almost identical to the Dell Inspiron E1705 I bought back in March.

Moving on…

My purpose here isn't to tell you about my quest for my 2nd laptop purchase of 2007, but rather to give a little advice to those currently looking for a new laptop, based on what I found during my search. This information won't be valid for long since everything changes so fast, so if you're reading this much after it is posted, email me and ask me for updated advice.

I'll make this simple, putting the most important factors first…

Rule #1: Screen size is the largest factor in the base price of any model. And it doesn't go from smallest to largest. Mid-size screens, 14-15" tend to be the cheapest, with the price going up as you go bigger or smaller from there. Screen size is also the biggest determining factor in the size and weight of your machine, so choose based on how you are going to use your machine. Do you need a big screen? If so, are you willing to carry the extra weight and work with less battery life? If you want small, are you willing to pay the price premium for the privilege and possibility of a small battery? Would working on a small screen with low resolution start to become a problem after a while? If you are someone who likes to run lots of programs at once, consider the bigger screen.

Rule #2: The apparent speed of a machine is very much affected by how much memory (RAM) it has. If you are getting a computer with Windows Vista, 1 GB of RAM is the absolute minimum, but I highly recommend you get 2GB. XP needs 512MB absolute minimum, 1GB or more ideally. Memory is very easy and relatively inexpensive to upgrade (around $100 to upgrade a new computer to 2GB after it has been purchased), so don't let insufficient memory turn you off from a computer that is otherwise exactly what you are looking for. When a computer doesn't have enough memory (RAM), it is forced to use storage space on the hard disk drive, which is literally thousands of times slower, so we really want to avoid that.

Rule #3: Get an Intel CPU, not AMD. Just a year ago I would have told you the opposite, but Intel has made leaps and bounds in performance and value, especially when battery life is taken into consideration. And stay away from Pentium and Celeron models; they are SLOW by today's standards. The current best value is the Core 2 Duo T7x00 series. These chips are fast, are pretty good on the battery, and don't add significantly to the price tag. You'll actually see a difference in performance with one of these chips compared with some of the other available alternatives. The Core 2 Duo T5x00 series is noticeably slower and doesn't save much money, and the Core 2 Duo T2x00 series isn't really an option I'd consider. The real principle here is the FSB, or Front Side Bus, speed of the chip. Chips with a faster FSB (667MHz or faster) operate significantly faster than chips with a slower (533MHz) FSB. Often you won't find the FSB listed in a computer's specifications, even though this one number has a greater impact on the computer's speed than almost anything else. The T7x00 series has the faster FSB, and a larger cache, for much better performance than the other options.

Rule #4: Go with Windows Vista, even if it means a few inconveniences in the short-term. We're going to start seeing some software in the next year or so that requires Windows Vista. Two years from now that will be more common. Windows XP may be fine for right now, but it will become more limiting in the future. As far as which edition of Vista to get, Home Premium is probably your best bet. Vista Basic isn't really an upgrade from XP (downgrade really), and Vista Ultimate doesn't offer any significant advantages given its premium pricing.

Rule #5: Get Antivirus/Antispyware Protection. I really like Spyware Doctor from PC Tools. It even includes Antivirus software now too. If your computer comes with Norton or McAfee, remove it and get Spyware Doctor instead.

Rule #6: Mac vs PC. I would say that unless you specifically need a Mac, buy a PC. The Mac has less software available for it, and sometimes you'll run into problems sharing files with PC owners. If you buy a Mac hoping to run Windows software, it can be done, but you will be buying a copy of Windows at an absolute minimum to pull it off; budget $200 for that. Simple enough, right?

Rule #7: Don't buy used. Kind of ironic advice considering I'm selling two of my old laptops, but it's really true… You never know how a computer has been treated, and a laptop that has been used regularly probably has an actual usable life of about 2-3 years tops before it just falls apart and quits working. Used laptops don't save that much money over new, and you'll certainly have to buy a new battery with a used laptop, taking away from an already small amount of savings over new.

Rule #8: If you use your laptop on battery, buy an extra. You'll need to replace it after about 12-18 months anyway, so just get it up front when you can usually get it cheaper as part of the initial purchase.

Rule #9: Hard drive size isn't that important. Unless you are going to be storing or editing video on your computer, hard drive size just won't matter that much. Bigger hard drives tend to be slightly faster than smaller drives, but you probably wouldn't notice the difference. The smallest you see on computers these days is 80GB, with 120GB being "normal," and anything larger just being a bonus. Unless you know you need the extra space, just ignore the HDD size.

Rule #10: Warranty. Unless you specifically pay for it, things like damage from dropping or misuse, or damaged screens aren't covered. Warranties tend to get expensive past the first year, and I'm not convinced it's worth it, especially if you know someone who is capable of swapping out broken parts.

Rule #11: Get a good case! The most common problem with laptops is broken screens due to their owners not taking proper care of them – packing them into backpacks that aren't designed to protect a laptop for example, or knocking them off the couch onto the floor. Unfortunately, the screen is also the most expensive part of the computer too. Buy a good case that will protect your new toy!


In short, get a computer with a Core 2 Duo T7x00 series CPU, 2GB of RAM, and a screen size that is appropriate for what you are going to be using it for.

Apathy, Ignorance, and Sound Quality

I may not be a purist audiophile, but I do consider myself a mainstream audiophile. What I mean by that is that I enjoy good quality sound, but not to the point where I blow tons of money (read: tens or hundreds of thousands of dollars) on each stereo component chasing the best sound possible like some people do. I think I can still understand what the average Joe on the street wants from his electronics, even though what I want may be a little different.

What blows my mind, though, is what content providers ("music companies") and electronics companies are trying to pass off as "high quality" sound these days. Things are being advertised as "CD quality" that aren't anywhere near the quality we get out of CDs. MP3s are so commonplace that many consumers think the sound of MP3s is normal, or even good, while those of us who have been exposed to good quality sound cringe at the sound of a typical MP3 file. While some attempts have been made to improve upon what is considered good sound (Apple, for example, uses AAC for iTunes/iPods, which is better than MP3, though it is still lacking), for the most part things have gone downhill, and done so very quickly.
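
To put a number on the gap: uncompressed CD audio is 44,100 sixteen-bit samples per second in stereo, while a typical MP3 of the day was encoded at 128 kbps. A quick calculation:

```python
# Uncompressed CD audio bitrate vs a typical 128 kbps MP3.
SAMPLE_RATE = 44_100  # samples per second (the CD standard)
BIT_DEPTH = 16        # bits per sample
CHANNELS = 2          # stereo

cd_kbps = SAMPLE_RATE * BIT_DEPTH * CHANNELS / 1000
mp3_kbps = 128  # a common MP3 bitrate of the era

print(f"CD audio: {cd_kbps:.1f} kbps")
print(f"A {mp3_kbps} kbps MP3 discards roughly "
      f"{100 * (1 - mp3_kbps / cd_kbps):.0f}% of that data")
```

Lossy encoders are designed to throw away the parts you supposedly can't hear, but a ten-to-one squeeze leaves plenty of room for the artifacts described here.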

I invested in an XM satellite radio several years ago, near the initial public launch of the service, and it was one of these products being touted as "CD quality." At the time, it was quite good considering the technology they were working with. Not stellar by any means, but certainly better than the average FM broadcast, and since I knew it wasn't going to be stellar, I was happy with it. It was definitely closer to CD than to FM radio in terms of its sound quality.

What has happened since then is nothing short of appalling. As XM has added more and more channels, they have gradually taken bandwidth away from the existing channels, reducing their sound quality to something only half a notch above pathetic. (I presume the bean-counters decided they could make more money by offering more channels to a wider audience, assuming that the average Joe is ignorant of sound quality issues.) And yet they continue to market it as "CD quality." These days FM radio sounds better. Things got a little better two weeks ago with an upgrade to the XM encoding systems, but they are still far from spectacular, or even acceptable if you ask me. Both the low and high frequencies are pretty much gone, and what is left is compressed so badly that it all merges into one big jumble of wishy-washy highs, muddy midrange, and come-and-go lows, leaving everyone guessing what instruments are actually being played and what lyrics are actually coming out of the lead singer's mouth. And they have effectively taken away our stereo image, sending us back to the 1940s with what is, essentially, monaural sound. Yeah, great technological innovation.

If XM was the only company that had fallen into this trap there would be no issue. The problem is that this reflects the attitude of just about everybody.

Because this transition has happened gradually, a lot of consumers are simply ignorant of it. Then there is another group that is aware of it, but apathetic. Shame on the penny-pinchers behind it, and shame on those who are apathetic. Those of us in a third segment who actually care about the quality of our audio are suffering. Our voices are not being heard, or are being drowned out by the shouting wallets of the ignorant and apathetic consumers. We can't get good quality sound even when we try.

Music is all about conveying thoughts and emotion. And a lot of that emotion is missing when the quality of our sound is taken away. Have you ever noticed how much more exciting it is to see a band perform live than it is to listen to a recording? A lot of that has to do with the faithfulness (or lack thereof) of the recording we are listening to. Listening to a good recording on a good quality sound system is an emotional experience. By taking away our high quality recordings and reproduction, the "emotion" half of the music equation is being stripped away from us. It's no wonder that a lot of the music coming out today lacks emotion; even if it were there in the first place, it wouldn't make it to our ears anyway. A hundred years ago we didn't have the option of listening to music in our homes; we had to listen to a live performance, and it was a much more enjoyable experience. That raises the question: has current technology really improved our lives musically?

I don't have any sort of answer to the problem, but it is, indeed, a problem.

Ironically, even though products are being marketed as "CD quality," CD quality isn't that great to start with. Not only can the human ear detect nuances of sound that CD simply isn't capable of recording (part of the "emotion" of it all), but the CD players most of us own (or, with the prevalence of iPods these days, the CD players we once owned) don't do a very good job of maximizing what is there. The two formats designed to take care of that problem, SACD and DVD-Audio, have pretty much failed at this point. The CD format has now been in consumers' hands for 25 years, and it was designed within the limitations of the technology of the time. We should expect far more than what CD has to offer, not be comparing other products against it.

The driving factor behind all of this is, of course, money. We, as consumers, want the most out of every dollar that we spend. And those that produce the products that we own want to make as much profit as they possibly can. And that means cutting corners.

I know I'm a little bit fanatical about all of this, but I don't think I'm that far off from someone in the mass population if they were exposed to the high quality stuff that is out there. The problem is that we keep having low quality products and content shoved down our throats, keeping us away from what can truly be a grand experience. A very dangerous precedent has been set.

I'll step down off of my soapbox for now… but next time you have the opportunity to listen to a good piece of music on a good sound system, take the time to actually listen and enjoy it. It just might open your ears to something you might really love. And take a minute to drop an email to a music company, or service, and let them know that you don't appreciate the shortcuts being taken. Right now they might still be able to hear you, but if things keep going downhill like they are, it might not be long before your words turn into the same total mush they are already trying to shove down our throats.

Monday, August 13, 2007

I Killed a Tree!

I purchased an Epson all-in-one Printer/Scanner/Copier tonight at CompUSA (model RX580 if you're curious) and both the clerk and I were quite surprised at the receipt that printed out… In addition to the normal receipt, it also contained five (5!) rebate forms. It was quite the roll of paper I had in my hand as I walked out.

I came home and measured it and it was 145 inches long. That's just over 12 feet! Have you ever seen a receipt that is twelve feet long… for one item? It's amazing! I feel like I killed a tree just with that receipt.

The worst part is that two of the five rebates are only valid when the printer is purchased together with other items. Which I, of course, did not purchase. You'd think their system would be smart enough to realize, "Oh, this rebate doesn't apply; not all required items are being purchased." But no… CompUSA is killing trees.

Monday, July 2, 2007

Eleven Tips for Great Engagement Photos

People ask me to do engagement photos for them all of the time, and I'm happy to do so. When planning your engagement photos, here are a few tips I came up with to make the process go more smoothly and give you better results.

Relax. The best pictures are ones where the couple looks comfortable, like they are enjoying being together. While I can offer suggestions for poses and will coach where necessary, just being yourselves usually gives the best results. Don't pick a time where we are under the gun to get everything finished; this creates undue stress, and it shows.

Clothing. My motto is that basic is best. I suggest solid colors, or even plain white. In most cases it is best to avoid patterns and black. Keeping it basic means the focus of the picture will be you rather than your clothes. Outfits do not need to match, but they should complement one another and have similar styling. Bring at least one change of clothes for some variety. And clothing should stand out from the background – don't wear forest green hoping to stand out from the trees.

Location, location, location. Engagement photos are about you, not where you have your pictures taken. More of the background in the picture means less of you. We want the focus to be on you instead of your environment, so it is probably best to not worry too much about where we take your pictures.

Lighting is everything. The key difference between a professional photo and an amateur photo is lighting. This means that for outdoor pictures we want to avoid the noonday sun's harsh shadows, and stay away from pictures at night. Pictures in the shade, taken on cloudy days and just before sunset are best for natural lighting. While I can use the flash to make up for some deficiencies in lighting, all natural is best.

B&W or Color? If you are considering doing pictures in black and white (or sepia) we need to take picture detail into account more than if we are working in color; fine detail is much more obvious and distracting in black and white. In addition, many colors appear to be identical, and we don't want you to blend into the background. With black and white contrast is key.

Announcement look? If you have an idea of the size and what you want the announcements to look like, let me know and I'll do my best to make the photos work within those requirements.

Let's Take Lots! If we shoot a lot of pictures, you'll have more to pick from, and better likelihood of finding something you like. Plus, you'll have pictures that can be used for other occasions, such as at a reception or photo album. I shoot digitally, so we aren't wasting any film.

Allow Plenty of Time. Not only should you allow plenty of time on the day of your photos, but allow plenty of time afterward for fixing in Photoshop and printing. These things take some time to finish. We shouldn't be taking your pictures days before the announcements need to be sent out. I very much prefer to spend some time tweaking photos for the best possible look.

Information, please. If you have things that you do or don't like about pictures you've had taken in the past, let me know and I'll work with it as best as I can. For example, if you like pictures taken from one side vs. the other, or hate your profile, I'd like to know. You know what you like better than I do.

No Catch-22s. I have found that people who don't like having their picture taken have had bad pictures taken in the past, so they get nervous in front of the camera which results in more disliked pictures. If you relax and forget the pictures you have taken in the past, you'll get better results.

Bring a helper. Having an extra set of eyes and hands makes a big difference in how fast we can work and picking up on details that the rest of us might miss.

Thursday, June 28, 2007

Adobe Tech Support

My copy of Photoshop on my main computer asked to be reactivated tonight, and the Internet activation failed, so I had to call Adobe's telephone activation line, (telephone activation, of course, also failed), so they transferred me to a customer service representative.

I could tell the guy was just reading a script, but I let him walk me through the troubleshooting steps, answering his questions, etc., pretending to be a clueless user. We walked through everything his computer told him to, and none of it worked so he put me on hold while he talked to a level 2 technician. Their conclusion of what my problem was? That I have USB ports on my computer. He wanted me to remove my USB ports in order to use Photoshop. He was very adamant and specific that it was my USB ports, not a device plugged into my USB ports.

Anyway, after explaining to him that I can't remove my USB ports he went ahead and gave me the reactivation code, and it worked. But oh my gosh… Remove the USB ports… That's really a new one.

If I was thinking a little more clearly I should have asked him the best way to do it. Should I use a hammer and chisel? Or is a jigsaw a better way to go? Perhaps chewing them off of the motherboard… or a little C4 smashed into the ports would take care of it. But what if none of those methods are quite precise enough? Maybe I should use an angle grinder instead, or rent a laser cutter, so I only have to remove that one part of the motherboard. After all, we wouldn't want to damage any of the motherboard's other components; we're only trying to remove the USB ports. What is the next step if I can't get Photoshop to load after removing my USB ports? And after the USB ports are gone, where do I plug in my mouse? Should I whack off the plug and duct tape it directly to my video card? After all, the mouse pointer shows up on screen, so it must be connected to the video card, right? The USB port is there only for convenience so I don't have to open up my computer to plug in my mouse. There's really no need to have a USB port getting in the way, preventing me from using Photoshop or any other Adobe software for that matter, they're just trouble. And when I want to copy pictures from my digital camera to my hard drive, I'll just epoxy that cable there. Let's just bypass the problem altogether, shall we?

Monday, May 7, 2007

My Wii Little Console

Well, I finally did it. I finally managed to locate and purchase a Nintendo Wii. These things have been very difficult to find in stock anywhere and thanks to a tip from a friend I was able to get one at Wal-Mart this weekend. They only had 3, and I was there at the store when they brought them out, and was able to get the last one. While I was out and about I also picked up an extra Wii remote controller and a classic controller so I could play with someone else at the same time, and play games using the 'virtual console' feature of the Wii.

After setting up the console and a rather lengthy update process, the Wii was finally ready to go. I have to admit that, though I knew the graphics on the Wii can't compare to the other current-generation consoles, they still didn't meet my expectations. Fortunately things improved quite a bit when I hooked up the Wii using a component video cable rather than the included composite cable, but honestly the graphics are still disappointing. I know the hardware is supposed to be more powerful than the original Xbox or PlayStation 2, but from what I have seen so far it doesn't really appear to be taking advantage of that power. Neither the Wii Sports game included with the console nor Super Paper Mario, which I purchased earlier today, seems to show off what the console is actually capable of doing.

The Wiimote

The other thing that immediately became obvious is that the Wii remote control (or 'Wiimote') is very different from anything I have ever used before. I knew that, but I didn't really "know" that. Some motions you make with the remote are very intuitive and natural; others aren't quite so much. For example, most of the games in Wii Sports are pretty easy to learn, because the motions you make with the Wiimote correspond very well to the motions used when playing the real sports. Baseball and Tennis in particular use very natural motions, and the motions used in Bowling are very easy to learn. I had quite a bit of trouble with Golf, as I found it virtually impossible to control how much power was delivered during a swing, resulting in the ball not going anywhere, or overshooting the hole by large distances. Boxing was reasonably natural to control, but I found that it didn't always respond when I tried to control my character. With all of that said, there are definitely times when the Wii doesn't accurately figure out what I am trying to do with the Wiimote and responds by either not doing anything, or doing something I didn't want it to do. Overall though, the Wiimote idea works fairly well. And I expect it will get better with time as developers figure out new and innovative ways to use this new control method.

Super Paper Mario doesn't really utilize the capabilities of the controller; it seems to only use the Wii-exclusive features here or there, with most of the control of the game utilizing conventional directional controls and button pushing.


You may have heard it before, but there are claims that the physical activity you must exert while playing the Wii will help to keep game players active and counteract the effects of sitting around watching TV or playing another game system. While the Wii does get you up and moving, at no point was my heart rate elevated, so I'm doubtful about how much benefit this may have. I did find, however, that after a while of playing Wii Sports that my shoulder and bicep began to be a bit sore. And as I write this a day later, they are still sore. I fear that I may develop a little Wii muscle on one arm while continuing to maintain the Wee little muscle on my other arm. I wish I had more coordination in my left hand so I could switch back and forth, but this certainly isn't the case.

Virtual Console

The Virtual Console feature of the Wii is pretty cool, and honestly, one of the main reasons I bought the Wii. It allows you to download games developed for vintage Nintendo, Sega, and TurboGrafx consoles directly from Nintendo for a small fee. Most games are between $5 and $10, depending primarily on their sophistication. I purchased Super Mario from the original NES for $5 and Mario Kart from the Nintendo 64 for $10. These games are 100% faithful to their original versions, and actually appear to be the original games running under emulation. They have even made sure that any glitches in the original game are included in the Virtual Console versions. The game selection at this point isn't huge, but this will change with time, and most of the all-time favorites are already there.

Playing Nintendo 64 titles requires either an add-on Classic controller or a GameCube controller.

A few negatives

Despite the fact that the Wii is quite fun to play, there are a few negatives. For example, as mentioned at the beginning of this post, the graphics on the Wii just don't quite measure up to expectations despite the alleged potential of the hardware. I won't hold that against the machine too much though because I know the hardware is capable of more and game developers are just now learning how to take advantage of the system.

Another negative is due to the innovative control system itself. Most games on the Wii use the Wiimote controller, but many also require the Nunchuk attachment, which is not included when you purchase a Wiimote. So plan on spending an extra $20 to add a Nunchuk to each Wiimote, bringing the controller total to about $60 per extra player. And if you play Virtual Console games, you may also need a Classic controller for each player, for another $20. So potentially you will be paying up to $80 in controllers per player if you fully equip the console.
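
Totting up those numbers (prices are the 2007 figures quoted in this post; the roughly $40 price for an extra Wiimote is inferred from the $60 total mentioned above, not an official figure):

```python
# Per-player controller cost for a fully equipped Wii, in 2007 US
# dollars. The Wiimote price is inferred from the totals above.
WIIMOTE = 40   # extra Wii remote (inferred price)
NUNCHUK = 20   # Nunchuk attachment
CLASSIC = 20   # Classic controller, for Virtual Console titles

basic = WIIMOTE + NUNCHUK          # enough for most Wii games
full = basic + CLASSIC             # adds Virtual Console support
print(f"Per extra player: ${basic} basic, ${full} fully equipped")
```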


The Wii is a fun console, no doubt about it. But it is, above all, just a game console. It shifts the game console paradigm slightly, but only slightly. Nintendo has done a brilliant job of marketing its product, getting it into the hands of people who might not otherwise buy a game console by making it easier to learn to play and keeping the price below the other options out there.

I don't think that having the Wii will take me away from playing games on my Xbox 360 at all, as it certainly doesn't replace the 360 (nor would it take away from or replace a PlayStation); rather, it adds a new type of experience. The Wii is different enough that it complements owning one of the other consoles rather than taking their place. It does things just differently enough that I can recommend a Wii to anyone even remotely interested in one.


Friday, April 27, 2007

Buying a Laptop…

I use two laptops. That might sound crazy to some of you, but they each have a purpose. Up until recently I was using a Dell Inspiron 9100 and Averatec AV4265. The Dell is a large heavy beast at approximately 8 pounds, but with an Intel Pentium 4 running at 3.2 GHz and 1920 x 1200 screen it is perfect for doing audio and video work away from home. The Averatec, on the other hand, is small and light – just over 4 pounds and small enough that when an 8.5x11 sheet of paper is placed between the base and screen (with the laptop acting as a file folder) the paper actually sticks out the front. The Averatec also lasts 4 hours on a battery charge – or at least it used to – so it really is a perfect machine when portability is key. But neither machine is dual core, and Vista has some annoying limitations on the Averatec, so I have been recently thinking about getting something new to replace both machines. Especially since most new laptops are dual core and that reduces the resale value of both of my current machines. If I sell now I'll get a lot more for them than I will even in a few months, so it really seems like now is the time to replace both machines.

Big & Powerful

The Dell hasn't been too difficult to find a replacement for. I have always liked Dell's offerings at the high end of the Inspiron line, so I started there, looking into the E1705 / 9400 (they are the same machine despite the name difference), and it seemed like a perfect fit. Not that I didn't look at other options… In fact I looked at a LOT of other options, and in the end there was one feature on the Dell that I consider essential that I couldn't find on any other machine: screen resolution. Dell offers a screen at 1920x1200, and I haven't found any other computers that match it. Since I use the computer primarily for audio and video production, which requires multiple programs to be open and visible simultaneously, I actually use all of that screen real estate, and it is very high on my list of priorities. I liked a few of the features on some of the other computers I looked at – most other laptops with a 17" screen have a full keyboard complete with numeric keypad (the Dells don't), for example – but in the end it was mostly about screen resolution and price. So the Dell is a done deal; my new machine shipped yesterday and will be on my doorstep in the middle of next week. Does anyone want to buy a Dell Inspiron 9100?

Small & Light

The Averatec is proving very difficult to replace – not because I am so attached to the machine itself (though upgrading the RAM to 1280MB has made it easier to live with), but because I can't find a suitable replacement. I have grown really attached to its size and weight. Its 13.3 inch screen and just over 1 inch thickness are perfect for me in a portable machine. For some reason laptops with a 14.1 inch screen get a lot bigger and heavier, and 12.1 feels just a little bit small, though it isn't off the table as an option. The problem with staying with a 13.3 inch screen is that they aren't offered by many manufacturers. Averatec is still selling the same 4200 series I'm trying to replace and doesn't offer a newer version with a dual core CPU. Their 2300 series uses a smaller screen and runs a much more power hungry AMD Turion processor (and as a result has poor battery life), not to mention that I hate the keyboard layout on the 2300s (I don't want to press two keys for Home/End or PageUp/PageDown!), so they don't have anything that fits me well for now. Sony has a few models in the 13.3 inch variety, but they are either bulky and heavy or outrageously expensive. I even considered getting an Apple MacBook and installing Windows on it, but I can't live without a real right mouse button, and its funky keyboard is a poor fit under Windows. (Should Apple fix those things and increase the RAM, I'd very seriously consider the MacBook.)

If I compromise on the screen and move down to 12.1 (again, the 14.1 machines are just too heavy) I have a few more options available, but every one of them has something wrong with it, and it's usually the price. I only paid $900 for my Averatec when I got it, and while I'm willing to pay a little more than that for a different brand, I'm not willing to double it for what would in reality be only a small step up over what I already have. Dell's 12.1 inch XPS is too expensive, as are Lenovo's offerings. HP doesn't have anything in that category, and Fujitsu seems to be stuck with old processors. I don't trust Gateway right now – their machines might be okay, but I don't want to take that risk given the frequency of repairs I've made on several Gateways for friends recently and the tech support horror stories I've heard. Toshiba has a couple of interesting machines in their U205 line, and they are a lot closer to where I want to be on price, but I hate their keyboard layout and their laptops can't be configured with Bluetooth, which I use a lot. (Who doesn't offer Bluetooth these days?!) So far, though, the U205 is the closest I've found to a match.

I'm still not excited about the U205, though, so for at least a little while longer I'll replace the battery on my Averatec and live with its limitations under Vista until somebody offers what I consider a good option at an affordable price, the price comes down on the U205 to the point where I can't say no, or my Averatec decides to die.


Mac OS X

I know I'm going to catch a lot of flak from Mac owners over this article. Mac owners tend to stick together and defend their choice of computer and operating system almost to the point of death. Discussions often get very heated and feelings often get hurt. Frankly I don't really understand why this is; a computer is a tool, and you usually don't hear many arguments about why a particular brand of screwdriver is better than another, certainly not with the same ferocity. And you certainly don't see television ads about how someone has switched from one brand of hammer to another, or two guys making tongue-in-cheek comments about how one bench vise can do things that another can't. So if you are a Mac addict, I'll gladly listen to any comments you make about factual information, but I would really rather not hear prejudiced comments about how your computer is better than my computer without factual evidence to back it up, especially about why "Windows sucks." For what I do a PC is a much better alternative than a Mac. You may have selected a Mac for what you do with a computer, and if so, I'm happy for you. If the Mac OS does what you want your computer to do, great. But the Mac OS doesn't "just work" for me. And this article attempts to explain to some degree why that is.

Biggest Beef

My biggest beef with the Mac OS isn't so much with the Mac OS. It is with the Mac owners themselves. For some reason, as part of this close-knit community they have put together, many feel they must pass along false information about what their machines can do that others can't, or what their machines do better than others. I have a feeling that most of the people sharing this information don't bother to do their own research. An example: I know a lot of guys who work with video and film, and a lot of them use Final Cut Pro. I also work with video, and I have elected to use Adobe Premiere Pro. Final Cut Pro is a fine product (I have it on my Mac and I have used it enough to be able to speak with some authority on the subject) but it doesn't beat my Premiere Pro "hands-down" doing… well, anything. Premiere Pro does nearly everything that Final Cut Pro does, and it does quite a few things that Final Cut Pro does not, including some very basic editing features that improve workflow tremendously. And Premiere Pro does NOT crash more than Final Cut Pro. Both pieces of software will crash from time to time, about equally. Contrary to the rumors being passed around, Premiere Pro has been more stable and consistent than Final Cut Pro for the things that I have done – considerably less weird behavior out of the Adobe product. Please don't get me wrong, I'm not trying to bash Final Cut Pro, because I do think it is a good product, but if you were to listen to a lot of Final Cut Pro users you would think that it does everything you could ever want with video, plus squeezes your orange juice and makes your morning toast, without you lifting a finger. Please, let's be realistic here. A piece of software is a tool, and no piece of software is perfect. Some are used differently, some are faster at performing certain tasks, but in the end, it is just a tool.
So if you are a Mac owner making such comments, please do a little research before perpetuating false rumors.

A great example of the self-perpetuating rumors is the recent television ads produced by Apple. In one such ad you see the "Mac" talking to a new digital camera from Japan easily because the Mac speaks its language, whereas the "PC" is not able to talk to the camera quite so easily (requires drivers). This is a classic example of highlighting just one particular situation, where other examples turn the tide the other direction. If the ad had selected a webcam instead of a digital camera the conversation would have been going the other direction: finding a webcam that will work with a Mac is quite difficult unless you are purchasing Apple's own iSight camera, whereas on a PC, many webcams "just work." And I have yet to find a digital camera that doesn't "just work" on Windows XP. But of course you will never hear this publicly discussed by Mac owners.

My Own Experience

This last summer I actually purchased a Mac Mini of my own so I could help friends who are Mac owners with projects, and to find out what all of the hype is about. I knew a few things about OS X going into it but I tried my best to lay those aside and keep an open mind as I learned a new computer environment. Initially I spent about 3 full days with the Mac, and have spent quite a bit of time with it since then as well.

For the most part everything looks nice in OS X; it has a level of aesthetic appeal that you typically didn't find in other operating systems until Windows Vista was released. Apple has always been pretty good about keeping things simple and clean, and OS X is no exception. I didn't find any examples of "there are too many options to choose from" anywhere in OS X itself. For a new user this was nice; I didn't have to spend much time reading each screen to change the settings I was looking for, and thus the intimidation factor was fairly low, though I don't think I would turn someone who is totally computer illiterate loose on it without some over-the-shoulder direction. Everything was going pretty well for a while, but I must say that I began to get a little frustrated fairly quickly.

I've been a Windows user for many years now and I have grown accustomed to having a certain level of customizability, and with the Mac, well, it just isn't there. You can't change the system font or the system colors, for example, out of the box. If you want to do such things you need to download and purchase third-party software. And speaking of fonts, I don't think the font rendering in OS X is anywhere near as good as what you find in Windows; everything looks a little bit blurry, like I forgot to put in my contacts, especially when text is small. A lowercase "h," for example, at 8 point in the default system font appears as a gray box with a gray vertical tail and no clear lines whatsoever. (On a related note, a friend of mine who uses the Mac had spent hours poring over some code trying to figure out why it wasn't working, and it turned out that he had used a period where a dash should have been used; he couldn't see it because on his Mac the two were virtually indistinguishable.) In contrast, text in Windows is always crisp and easy to read; there is always a clean edge on each character, whereas on the Mac a lot of gray is added to the edges of text, especially at smaller sizes, making it appear blurry and harder on my eyes. And since you can't customize the system font, you are pretty much stuck with what Apple has selected. This probably isn't a big deal to most people, but it is worth noting and caused me a little grief.

Software Installation

One fundamental difference between the Mac and the PC is the way software is installed. On the PC virtually every piece of software comes with an installer that you run to go through all of the steps required to get the software working on your computer. The Mac also has an installer, but not all software developers use it; some applications come packaged in such a way that you drag and drop an icon from a .DMG disk image file or a CD into your Applications folder. This is pretty easy to do, and once you figure out that this is what you are supposed to do it works pretty well. The problem is the way that software is uninstalled. On the Mac you drag the application package (the icons are actually packages of multiple files, more like a folder) to the trash. This is a little disconcerting for users of other operating systems, but we won't hold that against the Mac. It does, however, leave behind the application's settings files. And if the application used the installer instead of just being dropped into the Applications folder, anything the installer added outside of the Applications folder is left behind as well. Of all the apps I put on my Mac, I think I found only one uninstall program, despite the fact that quite a few of them used the installer, so I have absolutely no idea how much has been left behind after uninstalling. I know a lot of PC uninstaller software leaves preference and user files behind, but it almost without exception removes any miscellaneous program bits added as part of the installation, so you are left with preference and user files only.
Overall I'd say that the experience on Windows is more consistent—applications almost universally use installers and come with uninstallers—whereas on the Mac there are two main ways to install software and, from what I have seen, no consistent way to do a clean uninstall, so bits and pieces do get left behind. Again, not a huge deal, especially if you don't install and uninstall software frequently, but something to be aware of.
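The leftover-files problem is easy to demonstrate: after an application goes to the trash, its preference files typically remain under the Library folder. Below is a rough sketch of hunting for orphaned files by name; the app name and folder layout are made up for illustration, and a name match is only a crude heuristic (a real uninstaller would need the app's actual file manifest).

```python
from pathlib import Path
import tempfile

def find_leftovers(library_dir: Path, app_name: str):
    """Return files under a Library-style folder whose names mention the app.
    A crude heuristic; real uninstallers need the app's installed-file list."""
    needle = app_name.lower().replace(" ", "")
    return sorted(p for p in library_dir.rglob("*")
                  if p.is_file() and needle in p.name.lower().replace(" ", ""))

# Simulate a Library folder containing one orphaned preference file:
with tempfile.TemporaryDirectory() as tmp:
    prefs = Path(tmp) / "Preferences"
    prefs.mkdir()
    (prefs / "com.example.SomeApp.plist").write_text("<plist/>")
    (prefs / "com.example.Other.plist").write_text("<plist/>")
    print(find_leftovers(Path(tmp), "SomeApp"))  # only the SomeApp plist
```

On a real Mac you would point this at something like ~/Library, but treat anything it finds with suspicion before deleting; a name match alone doesn't prove a file belonged to the app.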

Bigger Disks Required

This brings me to another issue that doesn't get much attention. Since Apple recently switched from the PowerPC architecture to Intel's x86 architecture, Apple and other software developers are producing what are called "Universal Binaries," which include native code for both PowerPC and the increasingly common x86. These Universal Binaries tend to be quite a bit larger than their single-architecture cousins because they contain code for both platforms, so recent application files are considerably larger than what we have seen in the past on either the Mac or the PC. With a large enough hard drive and a fast enough Internet connection this probably won't be an issue, but if you purchase a Mac with a small hard drive, be prepared for it to fill up quickly. The 60GB hard drive in my Mac Mini was filled to capacity very quickly (to the point of burning some of the included software onto DVD so I could continue to work), even though I never copied any of my documents, music, pictures, or video files to the Mac. On my Dell laptop with a 60GB drive running Windows Vista, by contrast, I have all of the document files I have created over more than a decade, a large collection of pictures from my 8 megapixel camera, a significant library of music, and a couple of movies (in addition to a large selection of installed software), with 20 GB of space to spare. Disk space gets eaten up more quickly on the Mac, so if you buy a Mac, buy a bigger hard drive than you would on a PC.
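Universal Binaries are easy to spot programmatically: a fat Mach-O file starts with the big-endian magic number 0xCAFEBABE, followed by a count of the per-architecture slices it contains. As a minimal sketch (it checks only the 8-byte header, and the sample bytes are fabricated rather than read from a real application):

```python
import struct

FAT_MAGIC = 0xCAFEBABE  # big-endian magic at the start of a fat (Universal) Mach-O file

def fat_arch_count(header: bytes) -> int:
    """Return the number of architecture slices if `header` begins a
    Universal Binary, or 0 otherwise. Only the first 8 bytes are examined."""
    if len(header) < 8:
        return 0
    magic, nfat_arch = struct.unpack(">II", header[:8])
    return nfat_arch if magic == FAT_MAGIC else 0

# A fabricated two-slice header (think PowerPC + x86), just to exercise the function:
sample = struct.pack(">II", FAT_MAGIC, 2)
print(fat_arch_count(sample))  # 2
```

In practice you would read the first 8 bytes of a file inside an application bundle and feed them to this function; roughly double the architecture count means roughly double the executable code on disk, which is where much of the extra space goes.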

Speed

One rumor spread throughout the Mac community in the past has been that, despite lower processor clock speeds, a Mac is just as fast, if not faster, than a PC performing similar tasks. We don't hear a lot about this these days now that the Mac OS runs on hardware very similar to what runs Windows, where an apples-to-apples comparison is easier to make. Since the newest Macs run Intel-based hardware, you can install Windows on Mac hardware and it runs natively. As a result, you can get a good feel for the difference in speed between the two operating systems. On my Mac Mini, OS X is sluggish. Quite sluggish. Sluggish to the point where I'm often not quite sure if it registered a click to open an application (there is no visual indicator on screen that a program is loading), and I often have to wait several seconds when switching programs. I timed several applications' startup times and they were a lot longer than they should have been. Firefox, for example, took 22 seconds to get to a point where I could enter a web page address. Loading Mail and Address Book took 17 and 15 seconds respectively. Now, the Mac Mini hardware I purchased shouldn't be slow. It runs an Intel Core Duo processor, which anyone who knows PC hardware will tell you is quite fast by today's standards, yet everything I do in OS X seems slow. I thought at first that maybe it was because the Mini only came with 512MB of RAM, so I upgraded it to 1 GB, but that had no effect on performance; everything still takes a while to respond. What gives here?
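Stopwatch-style timing like the numbers above can be scripted for command-line programs. This is a rough wall-clock sketch, not how the measurements in this post were taken (GUI app launches were timed by hand), and the command shown is just an example:

```python
import subprocess
import sys
import time

def time_cold_start(cmd):
    """Measure rough wall-clock time for a command to launch and exit.
    This includes process exit, so it's an upper bound on launch time alone."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

# Example: time how long a fresh Python interpreter takes to start and quit.
elapsed = time_cold_start([sys.executable, "-c", "pass"])
print(f"{elapsed:.2f} seconds")
```

For meaningful comparisons, run the measurement several times and discard the first run, since the operating system's disk cache makes subsequent "warm" launches much faster than the initial cold one.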

I had quite an eye-opening experience when I installed Windows XP (and later Vista) under Boot Camp … XP was FAST! Nearly as fast as I had ever seen it run. It would boot and be usable in less than 30 seconds, and most applications would literally open within a second or two (though Firefox took a little longer, at about 10-11 seconds to be usable), if not instantaneously. So now I'm confused. I am running OS X and Windows XP on the same piece of hardware. If the Mac OS is indeed faster, as claimed, why is it running so much slower on my computer? I don't have that much software installed on my Mac (remember, I ran out of disk space pretty quickly, so I didn't even have the opportunity to install much), and nothing I have installed should be slowing it down—no antivirus or firewall or similar software getting in the way of performance on the Mac side of things. The only answer I can come up with is that OS X is just slower than XP. Mac owners, feel free to chime in here… this is a legitimate question; I am honestly perplexed why there is such a drastic difference. Is there something wrong with my OS X?

Update: I recently installed Windows Vista under Boot Camp, and it too runs very well on the Mac Mini. It is actually still quite a bit faster than OS X on the same box.

Not for keyboard users

While I am on the subject of speed, I also find that the Mac's design—not just its (apparently slow) implementation—really slows me down. I am one who uses the keyboard a lot to control my computer (the Windows and Alt keys are two of my best friends), and the Mac doesn't do a very good job of providing keyboard shortcuts for navigating the machine. I couldn't find, for example, a key to send focus up to the menu bar to let me select commands with the keyboard. I have become very accustomed to navigating menus on my Windows computers using just the keyboard, and it saves me a lot of time over moving my hands back and forth between the keyboard and the mouse. Having to reach for the mouse for every command makes me a lot less productive.

User Interface (continued)

And one other item that bugs me about the Mac is the way the menu bars themselves are set up. In Windows each window or program has its own menu bar that is always visible. On the Mac there is one menu bar that always stays at the top of the screen, and it changes to reflect which application's window is active. This reduces screen clutter a little, at the expense of productivity. When I am using two programs simultaneously (such as a web browser and a word processor, where I am copying and pasting between the two), things take twice as long to get done on the Mac: in order to access the menu bar for a given application I have to click on one of its windows first to reveal that application's menu bar (occasionally waiting a while for the computer to respond to the change), and only then can I use the mouse to select the menu command I want. When I want a menu command from the other application, I have to click on one of its windows, wait for its menu bar, move the mouse back up to the top of the screen, and so on. On a Windows PC every menu bar is available as long as its window is visible, and since I can access menu bar commands with the keyboard, very often my hands never have to leave the keyboard and I can switch between the two programs every couple of seconds. I realize that not everyone works this way, but power users may feel their work hindered by the extra steps.

It "just works?"

I often hear Mac owners brag about how hardware devices "just work" with their computers without any difficulty. This may be true for some devices, but my own experience has been quite a bit different. Apple has placed a fair amount of emphasis on having its driver teams create generic drivers for digital cameras, MP3 players, and some other popular devices, and devices they support do tend to work pretty easily. But I have had significant problems getting my Mac to talk to my cell phone for mobile Internet access – in fact it doesn't appear to be possible – while my PCs and my PDA all made it very simple to set up and get working.

OS X only comes with drivers for one of my four printers, and even that one works with greatly reduced functionality. Fortunately I was able to locate a driver for my color laser from the manufacturer's web site but it was somewhat difficult to install and get working because the printer uses an Ethernet connection rather than USB so it wasn't something that the Mac could automatically detect and get working. The remaining two printers aren't supported by Apple or the manufacturers, so I guess I am out of luck with those. My Mac also absolutely refuses to complete the Bluetooth pairing process with my headset; it just sits there doing nothing. Just to make sure it wasn't taking a long time I let it sit for 48 hours with no progress whatsoever, requiring a forced power off of the computer to end the process. So despite what Apple and many Mac owners will tell you, it doesn't always "just work." For someone looking at buying a Mac, make sure there is support for the hardware that you will need to connect.

Stability

We've probably all seen the television ad where "PC" freezes and requires a reboot. The implication is, of course, that OS X never freezes and never needs to be restarted. My own experience partly confirms this; you don't need to restart Macs very often, and Apple tends to use quality hardware that is stable. But then again, I never have to restart any of my 7 PCs either, other than for security updates—and OS X also has security updates that require a restart about once a month, just like Windows. The ads must be comparing Macs to PCs built with really cheap hardware that isn't stable (you do get what you pay for with PC hardware); if you ran OS X on cheap hardware it would crash too. I learned my lesson about cheap hardware a while ago, so now I use only quality components and my computers just don't crash. As of this writing my two desktop computers have each been running for over 30 days without a restart. In the end I reboot my Mac at least as often as my PCs, despite using it a lot less.


Security is a big issue that Mac owners like to bring up when talking to Windows users. Their claim is that the Mac OS is more secure and that they don't need antivirus, antispyware, and other such software on their computers. It is definitely true that there are far fewer Mac viruses out there – to the point that most Mac owners have never encountered one – so there may indeed be something to those claims. What scares me, though, is that this claim has been misconstrued to the point where many believe that OS X itself is not vulnerable to viruses, so no precautions need to be taken. What you end up with is a false sense of security, because most viruses and spyware don't rely on holes in an operating system to perpetuate themselves (worms are another matter); they use what is called social engineering to spread and do their damage – they trick you into installing them on your computer. Anyone who wanted to could write a virus for a Mac, and because most Mac owners do nothing to protect themselves, if a virus does get out it could turn into something really nasty. Most Windows users, on the other hand, are aware that precautions must be taken, and most of them install the appropriate software to protect themselves. For the time being, since Mac viruses are so rare, everything is fine in Macland. My fear is that someday things may change, and those who believe the Mac OS to be inherently more secure may be in for a rude awakening. It is just a matter of time. Windows has been a victim of its own success; should OS X ever catch on at the same level, it will have the same problems.

Other Issues for Switchers

There are a few other fundamental differences between the Mac OS and Windows that those who switch from one to the other will need to keep in mind until they become routine. For example, most Mac applications don't close automatically when you close their last window; you need to use the Quit command (in the application menu) to shut them down completely. Remember to Quit applications when you are done with them; otherwise they keep running in the background, potentially slowing you down without your realizing it. On Windows the default behavior is usually the opposite; most applications close completely when you close the window. For those who are learning the Mac I suggest getting in the habit of using Quit instead of just closing windows, so you know your apps are actually closed.

Another thing that still frustrates me is that Macs have no hard drive activity light or other visual indicator that the computer might be busy doing something. I couldn't tell you how many times I have double-clicked an application to open it, thought nothing was happening (remember that slow response time with no on-screen indication that the application is loading), and started to double-click again, only to find that the application had been loading all along and I just couldn't tell. I was also never able to figure out a way to make my Mac reconnect network connections to my server at startup; I still have to manually reconnect to get access to my documents and other files (fortunately you can tell the Mac to remember the network address, so at least I don't have to retype it each time). And if you have an Exchange email server like I do, configuring the included Mac applications to talk to it is deceptively difficult and time consuming, despite the fact that these applications appear to support Exchange connections.

Running PC Software

With the availability of Boot Camp and Parallels (and even CrossOver), Mac hardware is able to run Windows and therefore Windows applications. I wouldn't call this ideal, for reasons that differ based on which of those solutions you choose. Boot Camp, for example, currently doesn't provide drivers for all of the Apple hardware and requires you to restart your computer in order to use a Windows application, so you can't run Mac OS X and Windows simultaneously. Parallels addresses this problem by running Windows in a window on your Mac OS X desktop, at the expense of requiring a great deal more memory; without it, performance in both OS X and Windows really suffers. And even with enough memory, Windows applications don't quite run at full speed, and OS X takes a small performance hit as well. CrossOver would address both of those issues, because it attempts to run Windows applications under OS X through an emulation and translation layer, but it really doesn't work well for anything beyond a handful of applications. For both Boot Camp and Parallels you are also required to purchase a license for Windows (XP Home is $179.99 and XP Professional is $269.99), in addition to the $80 price of Parallels if you go that route. And if you run any Windows applications on your Mac you will probably want to invest in a real PC mouse, since the only decent Mac mouse doesn't have a right button. So, yes, you can run PC software on your Mac, but it is going to cost you, either by making your wallet lighter or in machine performance.

In Conclusion

The overall feeling I get from my Mac is that it is telling me, "You aren't smart enough to use a computer. Here, I know what you want to do, so let me do it for you instead," when for the most part the assumptions are wrong and it just takes me longer to do what I really wanted in the first place. Apple's OS X is fine for some people, but it certainly isn't right for me as a primary machine. Because of the way it is designed it just gets in my way, isn't nearly as customizable, and well, just doesn't have as much software available for it as Windows does, much of which I use as part of my daily routine.

I don't want to be too negative about OS X, though, because it is a nice operating system. It looks nice, Apple's software is more consistent in the way that it works than Windows is, and it is pretty stable, rarely requiring a reboot. And Apple always has nice hardware designs; you'll never have to hide a Mac because it's 'ugly.' And in the past Apple has used high quality hardware to build their machines (though this is changing to some degree since the move to the Intel platform).

So now when people ask me whether a Mac would be good for them, I have to ask what they will be doing with their computer. If your computing needs are modest—if you just browse the web, check email, listen to music, copy pictures from a digital camera, and don't share files with other people—the Mac OS may be just fine for you. But if you are a computer power user like me, you too may find that OS X is just too limiting for the types of things you want to do. My gut feeling is that the approximately 5% market share that OS X holds isn't far off from the percentage of the population OS X is right for as it exists right now (especially when the higher cost of Mac hardware is taken into consideration). So if you are among the 95% who need more from their computer, or don't want to pay too much for it, something other than OS X may be a better choice.


Sony PlayStation 3

Well my brother Brent finally broke down and bought a PlayStation 3. I haven't spent tons of time with it, but I have kept my eyes wide open while Brent has been using it and have been able to make a few observations, and make a few comparisons to my Xbox 360.


  • The graphics are very detailed and from the games that Brent has purchased and the demos he has downloaded, frame rates are high. It doesn't look any better than the Xbox 360 at this point, though. In fact games look worse because…
  • The games don't take advantage of anti-aliasing, so diagonal lines are jagged and objects with detailed textures tend to flicker a lot, especially when they are far away in the background. From what I have read online this is a hardware limitation, so it isn't going away. The PlayStation 2 had a problem with this too. Fortunately, because PS3 games are in high definition, the flickering occurs in much smaller pixels, so it isn't quite as annoying, but I still find it distracting. (I can't watch many PS2 games because of the alias flickering; PS3 games are at least tolerable.) The Xbox 360 does not have this problem in any of the games I have seen.
  • The PS3 does NOT up-convert ANYTHING, including PS2 games. And games designed for 720P have to be down-converted to 480P if your TV doesn't support the 720P format. (The Xbox 360 up-converts all games, including original Xbox titles to the highest format your TV/monitor supports.)


  • Blu-ray has the potential to look great on the PS3. Unfortunately, it is only recently that movie transfers have started to be decent. The movie selection, in my opinion, is still poor, slightly worse than HD-DVD's. Using the PS3 controller to control the Blu-ray player is kind of annoying and unnatural. If you intend to play movies on the PS3, invest in the remote control (especially since the PS3 has no infrared port, so no universal remote on this planet will work).
  • Getting video and music onto the PS3 is annoying. Since media can't be streamed off of a networked PC, you have to copy it over manually and store it on the PS3. One advantage it does have over the Xbox 360 is that you can store your ripped CDs on an external hard drive from the PS3 itself (the Xbox requires that you do this from a PC), but it will not pick up the file tags in AAC files created by iTunes, so you have to re-tag all of your music on the PS3, a task not even worth considering without a USB keyboard. Brent has spent hours retagging his music, which is something I wouldn't have done, that's for sure. Video formats appear to be limited to MPEG-4, with some pretty strict requirements on what it will accept, but a freeware program is available to transcode your existing video files into something the PS3 will play. The Xbox 360 has a HUGE advantage here: I had access to all 16,000 of my music files within a couple of minutes over the network, all of them properly tagged, and I didn't have to buy another external hard drive to store it all.
  • Connecting an iPod is virtually worthless. You can see the iPod's file system, and the PS3 will play the files, but because the iPod's file structure is essentially random and the PS3 doesn't support the tags used in those files, you will never find anything you want to hear.


  • The Sony store from within the PS3 isn't very good. For example, we can't figure out how to move back a level after selecting a game to view its details. To really use the store effectively you need to plug in a mouse (the on-screen controls aren't very controller-friendly), but the PS3 doesn't support Bluetooth mice, so you're stuck with a corded or USB wireless mouse. Who wants to sit that close to an HDTV? (Update: a recent firmware update for the PS3 added support for Bluetooth mice.)
  • If you download a trial game and decide you want to purchase the full version, you have to download a new version of the game instead of just unlocking it. The Xbox 360's system of just unlocking the demo you already have is much better.

PS2 Compatibility

  • PS2 games tend to look worse on the PS3 than they did on the PS2, with a few exceptions. Textures are rendered at lower resolution (who knows why?) in most titles, and the PS3 won't let you play some games that support widescreen in true widescreen format. It appears not to support the widescreen flag that can be set in a video stream to tell the TV to switch to widescreen mode, so even if you do find a game that runs widescreen on the PS3, you have to set your TV to widescreen manually. Considering that most PS3 owners are hooking them up to (widescreen) HDTVs, Sony really ought to figure out how to get PS2 widescreen titles to play widescreen on the PS3.
  • One little annoying quirk I just don't understand: despite the fact that your controllers are paired, on, and working, each time you start a PS2 title you have to press the PS button on each controller to re-connect.
  • I do like the fact that you can buy an inexpensive adapter to copy your PS2 game saves over to the PS3. But I don't like the way they handle it with virtual memory cards, because you have to keep track of which virtual card holds the save for each of the different PS2 games you play.

Hardware - General

  • The PS3 is BIG and it gets HOT! I wouldn't put it in a closed cabinet unless I was trying to play a game of Self Destruction. And to preserve its life, buy a small fan to blow on it.
  • The Sixaxis controllers feel like a significant step back in quality compared to the DualShock controllers for the PS2. They are very lightweight and don't feel well built. The first day Brent had his PS3, the controllers were already showing signs of wear; I don't expect one to last very long. And when they are twisted (which happens during play), they creak and groan, a sure sign they aren't built very well. Oh, and since the rechargeable battery is built in and can't be replaced, if your battery dies in the middle of a game you have to plug the controller in via USB using a very short cord, placing you right back in front of your HDTV again. The motion/tilt feature is a gimmick and doesn't add anything to the games I've seen that use it. And I missed the rumble.
  • The PS3 controller can be used on a PC via USB by downloading a third-party driver off of the Internet. Without the driver, the controller appears to be detected but doesn't actually work. We didn't test whether it could be used via Bluetooth.


  • The game selection isn't very good yet, and some of the available titles are bad ports from other systems. In fact, at least one game that Brent rented for the PS3 is much better on the PS2. The PS3 still doesn't have a game that 'wows' me. (Motorstorm looks good but isn't groundbreaking.)
  • There aren't going to be as many PS3 exclusives as there were PS2 exclusives. Many of the franchises that were exclusive to the PS2 are being developed for multiple platforms, or are now exclusive to other consoles.

Other Stuff

  • The PS3's on-screen keyboard is confusing and awkward… just plain terrible! If you buy a PS3, buy a USB keyboard, even if just for the online account signup process.
  • I can't comment on sound quality for movies or games; Brent's PS3 isn't connected to a good sound system.


The only advantage I see in the PS3 over the Xbox 360 is the Blu-ray drive. (And speaking of high-definition movie formats, my personal belief is that neither Blu-ray nor HD-DVD will catch on very well until the players are cheap, or until HD movies come with SD versions you can play on your existing DVD players: laptops, in the car, or the other TVs in your house.) Some may argue that the PS3 is a good choice for those with a large PS2 game collection, but honestly, the PS2 makes a much better box for playing PS2 games.

To be completely honest, Xbox 360 games looked better at this point in its life, and look a lot better now, despite the hype about the PS3 being a more powerful machine (which is a point of debate anyway). The 360's online service is unmatched, especially now that movies and TV shows are available for download. I like the feel and build quality of the 360's controllers a lot better than Sony's as well. I suspect the PS3 will be considered a success, but probably not at the rate that Sony and PS2 fanboys hope; for now it is too expensive, without enough compelling games or features to justify its high price. And as of this writing the 360 is outselling the PS3; if that trend continues, the PS3 obviously won't be able to catch up to or surpass the 360.
