
Tuesday, October 14, 2014

Why Web Sites Don’t Need to Store Your Password

It seems counterintuitive, but web sites that require logins don’t actually need to store your password.  And they shouldn’t – it is a very bad idea to do so.  We see too many leaks of account databases for it to ever be safe to store passwords in any form, even if encrypted.

So how can a site validate a login if it doesn’t store the password?  The answer is something really cool called a hash function.  I know your eyes just glazed over, but bear with me – the concept is actually simple.

A hash function is a way of processing data that is one-way… you can put data in, and always get the same result coming out, but there is no way to reverse the process to get back the original data.  I won’t get into the specifics of how hashes actually work, but I can describe a very simple hash that will illustrate the principle.

Say, for the sake of simplicity, we are creating a web site that uses a 4-digit PIN as the password to log in.  We know that storing the PIN itself is dangerous because it could be leaked or viewed by site administrators, so instead we add up the four digits and store that sum.  So if my PIN is 2468, we store 20 (2+4+6+8) in the database.  When we go back to the site to log in, the site can add up the four digits we enter for the PIN, compare the result against the sum in the database, and validate that we know the correct PIN.  A hacker who gets his hands on the database only knows that the sum of the digits is 20… he can’t possibly know that the original PIN was 2468.  He’d have to guess the original PIN by trying different combinations.
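To make that concrete, here is a minimal sketch of the toy digit-sum hash in PHP.  The function name is just for illustration:

    <?php
    // Toy "hash": sum the digits of a 4-digit PIN.
    // For illustration only -- never use anything this weak in production.
    function digit_sum_hash($pin) {
        return array_sum(str_split($pin));   // "2468" -> 2+4+6+8 = 20
    }

    // At signup, store only the hash -- never the PIN itself.
    $stored = digit_sum_hash('2468');                  // 20

    // At login, hash whatever the user types and compare.
    var_dump(digit_sum_hash('2468') === $stored);      // true:  valid login
    var_dump(digit_sum_hash('1234') === $stored);      // false: rejected
    var_dump(digit_sum_hash('8642') === $stored);      // true:  a collision!
    ?>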

Of course this is overly simplified.  This demonstration hash function wouldn’t work in the real world because it is too easy to find combinations that would let hackers in.  This situation is called a collision… 8642, 5555, 8282, 1991, and 6446 all produce the same hash value of 20.  Real hash functions used for account login verification are much, much more complicated, and collisions are so rare that finding one deliberately is practically impossible.  But you get the idea.  Instead of storing the actual password, we store a value that is calculated from the password.  We can validate that someone knows the password without ever storing the password itself.

This has other advantages as well.  For example, with a hash function there is no limit on the length of the password, because hash values are always the same length regardless of the amount of data going in.  Someone could enter 6 letters, or 200 random symbols, and either one can be hashed down to a value of a standard length that can be stored in the database.
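For instance, PHP’s built-in hash() function always produces output of the same length for a given algorithm, no matter how long the input is – a quick sketch:

    <?php
    // SHA-256 output is always 64 hex characters, regardless of input size.
    // (Shown only to illustrate the fixed-length output -- a fast hash like
    // this is not, by itself, a good way to store passwords.)
    echo strlen(hash('sha256', 'secret')), "\n";              // 64
    echo strlen(hash('sha256', str_repeat('x', 200))), "\n";  // 64
    ?>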

Because of this, you can sometimes spot web sites that don’t use hashes to securely store passwords, because they enforce a maximum length for passwords.  This isn’t always the case, but it can be one indicator that the site’s security has been poorly designed.  So if you are signing up for an account on a web site and they impose a low limit on password length, like 12 characters, look for other signs of poor security or privacy practices – and definitely don’t reuse a password from another site.  Or just steer clear.

The down side to using hashes is that if you forget your password, the site has no way of sending it to you… because they genuinely don’t know it.  That is why sites generate a brand-new, random password and send it to you via email when you forget yours.  They honestly have no idea what your password was, so the only solution is to create a new one for you to use temporarily until you choose your own.
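A rough sketch of that reset flow in PHP (the details will vary from site to site, and the variable names here are just illustrative):

    <?php
    // The site can't recover your old password, so it invents a new one.
    $temporary = bin2hex(openssl_random_pseudo_bytes(8));  // 16 hex characters

    // Store only the *hash* of $temporary in the database, email the
    // plaintext to the user exactly once, and require them to choose
    // their own password at the next login.
    ?>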

The whole process is considerably more complicated than I’ve described here – or at least it should be.  Just using a hash isn’t sufficient, either, because affordable computers these days can calculate billions of hashes per second and are therefore capable of brute-forcing short passwords very quickly.  (A 6-letter password, for example, would be cracked hundreds of times over in just one second using a simple hash.)  That is why well-designed sites also add a random salt to each password and use deliberately slow hash functions.  But for a site to use a hash on passwords at all is one step in the right direction.
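In PHP, for example, the built-in password_hash() and password_verify() functions (available since PHP 5.5) take care of both pieces – a random salt and a deliberately slow algorithm – so a sketch of doing it properly looks like this:

    <?php
    // password_hash() applies a random salt and a deliberately slow
    // algorithm (bcrypt by default), making brute-force attacks on a
    // leaked database far more expensive than with a plain, fast hash.
    $hash = password_hash('correct horse battery staple', PASSWORD_DEFAULT);

    // The salt and cost factor are stored inside the hash string itself,
    // so verifying a login needs nothing else:
    if (password_verify('correct horse battery staple', $hash)) {
        echo "Login OK\n";
    }

    // The cost can be raised as computers get faster:
    $slower = password_hash('secret', PASSWORD_BCRYPT, array('cost' => 12));
    ?>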

Thursday, August 22, 2013

Software Development: Old School or New School?

Since I started writing software when I was 5, I’ve been doing it a long time.  I’ve seen a lot of changes in the technology – from the BASIC language all the way to assembly, desktop to server, fat client to thin client, you name it.  But the trend I’ve seen over the last 10-15 years is troubling.

There is absolutely no question that the Internet has changed things radically.  Much of that change is good.  There is, however, an aspect of the Internet and the way software is written for it that is disturbing.  Many of the time-tested, well-thought-out, efficient ways of coding are disappearing, replaced by junky, ill-conceived, incredibly inefficient substitutes.  People who are learning to code now are mostly coding for the web, and it is very upsetting how little they understand of the actual science behind computing – mostly because the development tools in use today don’t even support the best, time-tested concepts.

As part of my job I do software development in both Delphi (a modern-day variant of Pascal, very similar to Microsoft’s C#) and PHP.  Delphi is extremely efficient, and has adopted most of the best ideas that have ever come along in computing.  PHP is at the opposite end of the spectrum – extremely inefficient, and lacking many of the most basic tools that real high-level languages offer.  JavaScript (the programming language that powers web browsers) is in an even more dire situation – it is far more basic than even PHP.  Yet nearly all of the hype in development today is around HTML 5, JavaScript, Java, and PHP.  All of which are, frankly, very immature, and are evolving at a glacial pace.

One of the technologies falling by the wayside is object-oriented programming, which lets developers create virtual objects that can be copied, manipulated, and acted upon extremely easily and efficiently.  Java is object-oriented, but it has other problems of its own (efficiency and security being the main two) that are causing it to fall out of favor quite rapidly.  PHP has some support for objects, but frankly it’s pretty terrible.  HTML and JavaScript don’t even attempt to support it.  People who are learning to program now don’t seem to understand how much easier their lives would be if they had access to object-oriented development tools.  And the situation is actually getting worse, not better.

Another concept that is lost on the web is code compilation.  Pretty much ever since the dawn of computing, developers have run their code through a compiler to produce the set of instructions native to their computer, so the code doesn’t have to be translated at the time the software runs.  Consider how much more efficient you are speaking your own language than you would be trying to converse in Korean using a dictionary, having never heard or seen a word of Korean before.  Compiling does the translation ahead of time (just once) so that software runs as quickly as possible.  Web technologies don’t do compilation – they do the “translation” at the time the code is executed, making things incredibly slow in comparison.  In addition, since the translation is done at run-time, you have to distribute the actual source code (the original code you’ve written) in order for your software to run… so anybody who wants to can take your code, modify it, and redistribute it… or, in cases where you’ve got content you want to protect, like music or a movie, everybody can see exactly how it is protected so that protection can be removed.  Java has the ability to do a sort of rudimentary compilation just before code is executed, but the result is still far from true native code, and it still slows you down considerably.

It’s almost like about 15 years ago people said, “We don’t care about all of the research and learning that has occurred over the last 50 years.  We’re going to come up with a new way of doing things, no matter how good some of your ideas may be.”

As someone who works in both worlds, I find it incredibly frustrating.  Especially when I have to interact with people who have only ever spent time in the newer web technologies, because they don’t have even a remote concept of what they are missing out on.

There are a ton of other great technologies that seem to be falling by the wayside.  True code debugging (the ability to see what is happening inside software as it runs, making testing much, much easier) is extremely rare.  RAD (Rapid Application Development), once considered the epitome of efficient design and coding, is almost unheard of today.  True integration with databases is pretty much gone too, and in its place are incredibly difficult-to-program, very bloated communication methods that make coding difficult, especially if it is to be done securely.  Forgive me, but fname.value=’Frank’ is easier (and conceptually much more sound) than “UPDATE users SET fname=’Frank’ WHERE userid=56”, and this is exactly the sort of difference I’m talking about.  For the most part web developers aren’t even remotely aware that the tools we had were much better than the best of what they have access to today.  It’s really quite sad.

I’m not saying for a minute that these newer tools don’t have a place.  They do.  But very little, if anything, is being done to improve the tools and incorporate the lessons that 70 years of computing science have taught us.  There’s almost a wall there where anyone who works in the newer tools will automatically dismiss ideas from the old school just because they are old school, not because there is any real reason to do so.

So I have to admit that I don’t really enjoy having to work with HTML and JavaScript and PHP.  They all seem incredibly antiquated to me – almost like I’m stepping back in time 30 years.  In many cases it is much harder to do things in the “modern” tools than it was in the contemporary tools of the early 1980s.  Things that I’ve taken for granted in what I would call a “real” development environment just don’t exist in their “modern” counterparts.

Would you enjoy having your Ferrari swapped out for a Model T?  And somehow I’m expected to like it.

The result of all of these backwards ways of doing things with “modern” tools is that it takes forever to get anything done.  I can easily write “equivalent” code in Delphi five times faster than it can be done in PHP, even though at this point I probably know PHP as well as anyone could.  And, on average, it takes about half as many lines of code in Delphi to accomplish something as it does in PHP.  Yet the Delphi code literally executes more than a hundred times faster, and provides a better user experience.  And somehow people are critical of my decision to continue using such a tool – only because they don’t understand it, and in most cases refuse to even try.

Much of the stagnation in web technologies is due to the bickering and in-fighting between the companies that build tools for the web.  HTML 5 is, in reality, very poorly suited for what we are asking it to do today.  Everybody involved wants their own ideas for improving it to become the standard, but nobody else is willing to adopt those ideas, because they aren’t their own and they can’t profit from them.  In the 1990s and early 2000s, for example, Microsoft tried to extend HTML with new features in Internet Explorer, and they got shot down by everyone else because they weren’t “following the standard.”  Well, yeah, they weren’t… because there wasn’t a way of doing the things they wanted to within the standard.  Yet when people do actually get together to try to improve the standard, nobody can agree on anything, so nothing gets done.  We’ve been talking about HTML 5 for nearly ten years, and it is still so poorly supported across different browsers that you almost can’t use it.

Trying to create interactive web pages is an absolute disaster – programmers have to take care of every low-level event (button click, mouse move, release), and those events differ from browser to browser.  Want to play music or show video on a web page?  Nobody can even agree on how to do that, so you have to produce three separate versions of every file, then figure out which version to use when the page is viewed.  HTML was never designed to handle any multimedia other than graphics, which is why so many web pages use Adobe Flash, despite the fact that everybody hates it.  Want to do things like drag-and-drop?  Good luck.  It’s really hard to do, and usually has to be coded multiple different ways to work in all popular browsers.  But in my ‘old school’ Delphi, drag-and-drop doesn’t even require writing a single line of code.  Just set an object property saying ‘yes, you can be dragged’ and ‘you can accept dragged objects.’

Adding database interactivity to a web page is an exercise in patience and frustration.  There still isn’t an official way for a web page to pull data from (or insert data into) a database.  It’s still a very tedious and time-consuming thing to do.  Don’t even get me started on how nobody does it securely, because that’s even harder.  We’ve had databases for 50 years, so basic interactions like this should be a cakewalk.  In Delphi, all I have to do to retrieve record 56 from the users table of the database is users.FindKey([56]).  The same thing in PHP takes a minimum of 4 lines of code – many more if you do proper error checking.  And in JavaScript?  Well, don’t plan on working on anything else that afternoon.
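For the curious, here is roughly what that looks like in PHP with mysqli prepared statements – a sketch only, since the connection details and the users table layout here are made up for illustration:

    <?php
    // Roughly the PHP equivalent of Delphi's users.FindKey([56]).
    // Hostname, credentials, and schema below are made up for illustration.
    $db = new mysqli('localhost', 'username', 'password', 'mydb');
    if ($db->connect_error) {
        die('Connection failed: ' . $db->connect_error);
    }

    $userid = 56;
    $stmt = $db->prepare('SELECT * FROM users WHERE userid = ?');
    $stmt->bind_param('i', $userid);
    $stmt->execute();
    $user = $stmt->get_result()->fetch_assoc();  // null if no record found
    $stmt->close();                              // (get_result() needs mysqlnd)
    ?>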

It goes on and on.  Want a web page to interact with a joystick?  Not happening.  Or to generate printer output with full control over how it looks?  Again, not really possible.  How about photo editing?  Not very plausible in HTML.  How about a page that uploads a picture to your cell phone over USB?  Nope, HTML doesn’t allow it.  And it will likely be at least a decade before such things are actually possible and usable.

All of the above problems had already been pretty much solved by traditional development tools long ago. 

And somehow many of the companies that have produced the strongest tools and environments for software development in the past are abandoning their more mature technologies.  Microsoft is trying to force everybody to write Windows 8 apps, despite the fact that this environment, too, is missing some of the best things from their traditional desktop environment.  Apple invests very little in desktop technologies.  And Linux stagnated years ago.

It’s really pretty sad.  If people were smart they’d take the best ideas from wherever they come from instead of trying to reinvent the wheel over and over.  As it stands today, the technologies that power the web – HTML, JavaScript, etc. – are more of a wooden, square wheel than most developers realize.  The traditional ways of doing things don’t have to be left behind – they could easily handle the same tasks the newer technologies are doing, and in most cases do a far better job of it.  Or some of the concepts from traditional development could be added to the newer tools.  But, for some reason, never the twain shall meet.  It’s frustrating having to choose between high functionality, quick development, and high performance on the one hand, and working on the Internet on the other.  It would be really nice to be able to have both.

Monday, October 1, 2012

Should I Wait for Windows 8?

Windows 8 becomes available at retail in just a little less than 4 weeks.  With computers still shipping with Windows 7 until then, and most vendors not automatically covering the $15 upgrade charge, does it make sense to buy a Windows 7 computer now, or to wait a few weeks and get one with Windows 8 from the factory?

I’ve been using Windows 8 a bit here and there since the first preview release nearly a year ago, and I was given access to the final release version of Windows 8 back in August.  So I’ve had a little time with it now and have had a chance to formulate an opinion on it based on actual hands-on time rather than just by reading articles on the Internet.  I haven’t used it as my primary operating system, but I have spent quite a few hours with it.

So instead of making you read a long, drawn-out article that covers every little change, let’s just get down to brass tacks.  Should you wait for a computer running Windows 8?  Let me answer that question with two of my own: Does the computer you are looking at buying have a touch screen?  And would you be happy running a tablet-style interface?  If the answer to either of these is “No,” then sticking with Windows 7 is likely your better option at this point in time.

But isn’t Windows 8 supposed to be the latest and greatest version of Windows?  Isn’t Microsoft betting the farm on it?  Yes, and yes.  And while the changes they have made give them the opportunity to provide the best experience on a tablet device, they’ve really sacrificed ease-of-use on computers that are still going to be used primarily with a mouse and keyboard.  The user interface they’ve created is just awkward with traditional input devices, even if it is very well designed for touch-friendly devices.

For those not familiar with what I’m talking about, Windows 8 completely ditches the Start menu that we’ve been used to since it first appeared in Windows 95 just over 17 years ago.  If you’re used to launching your software from the Start menu, you’re in for a real shock as you discover that your precious Start menu is completely gone, replaced by an entire Start Screen with very large tiles for starting your applications.  Even on a large, high-resolution monitor, you’ll only see a few dozen tiles at best.  On a smaller screen, you’ll have to scroll horizontally to find anything that doesn’t fit in the initial view.  And scrolling is kind of a problem… the only efficient way to scroll from the keyboard is with the Page Up/Page Down keys (which many laptops have now abandoned), and the mouse’s scroll wheel doesn’t scroll horizontally either.  So you have to use the scrollbar at the bottom of the screen, which is a little awkward.

Once you’ve started a traditional Windows app, the way to get back to the Start Screen to launch another just isn’t apparent.  There is absolutely no visual indication on-screen of how to get back.  Only if you know to move the mouse to the very bottom-left corner of the screen can you figure out how to return to the Start Screen from the desktop.  It’s mind-boggling to me that something so necessary to using the computer efficiently has no button or other visible way to reach it.  You’ll get used to it, but it is an odd choice that Microsoft didn’t provide even a single button to navigate to the most important part of its interface.

The good news is that once you’ve gotten used to the strange new interface, Windows 8 is very fast.  There is as much of a speed improvement going from Windows 7 to Windows 8 as there was going from Vista to 7.  Yep.  It’s just that much faster.  On one of my computers, running an SSD, I was able to get Windows 8 to boot in under two seconds.  I’m not talking about waking from some kind of sleep mode, I’m talking about a full reboot.  Once the computer got past its system check screens and the Windows 8 logo first appeared, the login screen was visible and fully usable in under two seconds.  Most computers won’t see that kind of performance, but boot times under 10 seconds will be common.

Microsoft has also done a great deal to speed up third-party software.  They’ve completely revamped the graphics code, so everything draws on-screen much faster than it has in the past.  They’ve done a lot of work to temporarily shut down (or at least pause) programs running in the background so they don’t slow down the software you’re actually using.  They’ve also dramatically cut the number of programs that have to run in the background for Windows to provide its standard functionality… there has been a lot of simplification and consolidation to make sure that everything you need is still there, but that it runs more efficiently.  As a result, your computer will run faster under Windows 8 than it ever has before, and computers will perform better with less memory (RAM) than they have in the past.

The other nice thing that Microsoft has done is to drastically reduce the price of Windows 8 as compared to previous versions.  If you already have a computer running Windows XP, Windows Vista, or Windows 7, the upgrade price is just $40.  If you purchase(d) a computer after June of this year, the upgrade is $15.  So if you do decide to go with Windows 8, at least it won’t cost you that much.

Bottom Line

So, bottom line: if you’re comfortable with Windows 7, don’t want to struggle with a completely new user interface, aren’t going to be running it on a tablet anyway, and the computer you’re looking at buying is already plenty fast, I’d skip Windows 8… at least for now.  You could always pay the $15 to buy a license for it, but not actually install it just yet.

As for me, I’ll be keeping one computer around with Windows 8 so I can test my software on it, but other than that I don’t plan to upgrade any of them, and I don’t have any plans to buy a computer with Windows 8 on it.

The other thing to be aware of is the new Windows RT tablets that will be available at the Windows 8 launch.  It’s important to know that Windows RT is not Windows 8, and these devices cannot run traditional Windows software.  They can ONLY run Windows RT apps (sometimes also called Windows 8-style apps), so you’re talking about a completely new investment in software, and not much of it will be available for a while yet.  Try thinking of Windows RT as “Not Windows,” because it doesn’t even remotely resemble the Windows you are used to.  The software you already own won’t run on it no matter how hard you try.  If you need to run Windows software on a new tablet/computer, Windows 7/8 are your ONLY options.

Saturday, May 7, 2011

Tech Tip: Extra Life From Your Old Computer

One of the things I did this week was try to upgrade my netbook computer with an SSD (Solid State Disk) drive to make it faster and more bearable to use (more on what that is in a minute… bear with me). It’s always been kind of slow, and I figured if I could put $100 into it instead of buying a whole new one, that would be a good thing, right?  Well, that didn’t work out so well… performance with the SSD was actually far worse than it was with the hard drive that was in it, and the “fix” to make it work right just ended up not being worth it… so I had an SSD without a home.  After playing around with a couple other ideas, I decided to put it into an old Toshiba laptop I’ve had for a little over 3 years because it has always felt a little slow.  And boy, what a difference it made.

An SSD is a storage device that acts like a hard disk drive, but uses memory chips instead of a spinning platter to store data.  Since there are no moving parts, they are very fast.  They’ve been prohibitively expensive until fairly recently (and it’s still expensive to get something with a lot of storage capacity) but they’re finally in the realm of being affordable for the masses as long as your storage needs aren’t extreme.  The SSD I bought was an OCZ Vertex 2 60GB model which I picked up on sale for just over $100.  Since I don’t store music or movies on that laptop, this was plenty large enough.  Windows 7, Microsoft Office, and Photoshop take up around 20 GB total, which gives me plenty of room to spare for anything else I might need to put on it.  The difference in performance was enormous!

If you can work a screwdriver you can install an SSD in your computer.  The physical installation is very easy.  The only part that might get a bit tricky is getting Windows installed onto it.  If your computer came with a Restore DVD, or you have an original Windows installation DVD, setting it up is a piece of cake.  If it didn’t, you may want to invest in a data transfer kit.

Prior to installing the SSD, my Toshiba laptop would take about 60-75 seconds to boot.  With the SSD it takes about 13 seconds.  That’s logo screen to usable desktop, folks.  While it previously took about 5-10 seconds to load Microsoft Word on the hard drive, it now loads in less than 1 second on the SSD.  Photoshop loads in 6 seconds instead of 40, and web browsers come up instantly.  Launching most programs occurs almost instantaneously.  As I was installing Windows updates (I started with a fresh copy of Windows), I was amazed to see the majority of them install about one per second instead of watching the minutes tick by.  From start to finish (empty drive to installing Windows to installing all available updates) it only took about 30 minutes to do everything.  And this computer is SO fast now… even though it’s over 3 years old and wasn’t that much to shout about when it was new.

Swapping out a hard drive for an SSD isn’t the only easy and relatively inexpensive thing you can do to speed up an aging computer.  Upgrading the memory is also very easy and doesn’t cost that much (the Crucial web site has a scanner that can tell you what type of memory your computer needs).  I upgraded my Toshiba laptop to 4GB of RAM for $35 a couple months ago, and recently upgraded a different laptop to 8GB of RAM for $85.  The desktop computer I built last month got 8GB of RAM for about $80 as well.  If you’re running a computer with just 1 or 2GB of RAM, it’s time to upgrade.  The performance difference can be pretty dramatic.  Not quite as drastic as replacing a hard drive with an SSD, but still quite noticeable.

So how do you know if your computer can be upgraded with an SSD?  If it’s less than about 4 years old, the chances are very high.  The computer needs a SATA interface for the hard disk drive, which most machines made in the last 4 years have.  If you’re working with a desktop computer, you can probably buy a relatively small SSD for your operating system and programs, and use your existing hard drive as a secondary drive for storing your personal data.  That’s the route I’ve gone with the last two computers I’ve built, and I’ve been thrilled with the results.  As far as which model to get, the drives based on SandForce controller chips currently yield the best performance (the OCZ Vertex 2 series gives the best bang for the buck and is available in 60GB, 120GB, and 240GB sizes; for better performance at a higher cost, step up to the OCZ Vertex 3 series).

So, long story short: if you’ve got an old computer that is slower than you’d like and you don’t want to shell out a pile of money for a newer model, chances are you can swap out your hard drive for an SSD and upgrade the RAM without it costing you much, and you’ll end up with a computer that feels better than it did when you first pulled it out of the box.  It will actually feel much faster than a new computer, unless the new one happens to come with an SSD.

Installation of either the SSD or memory is pretty easy, but if you’ve got a hungry computer-savvy buddy, bake him or her a pie or plate of cookies to install yours for you.  You’ll be SO glad you upgraded.

Tip: SSDs perform best under Windows 7 (or the most recent versions of Linux).  Windows Vista, XP, and Mac OS X will run on SSDs, but they do not fully take advantage of the extra performance that SSDs offer.  These operating systems also suffer from a problem that causes writes to the disk to become incredibly slow after a period of time, because they do not support a feature called TRIM.  The slowdown kicks in once the total amount of data ever written to the drive exceeds the drive’s capacity – not when the drive is full, but once that much data has been written in total, regardless of whether you’ve been overwriting or deleting files.  Since the operating system itself writes to the disk a lot as part of its normal operation (especially if you don’t have enough memory), you’ll probably hit this limit a lot faster than you’d think.  Windows 7 knows how to properly tell the drive which parts are no longer in use, so it does not suffer from this problem.  If you’re running Vista or XP, you should upgrade to Windows 7 for the best results with an SSD.  Mac users, you’re kind of out of luck... you’ll see amazing performance on your SSD for a while, then it will slow down drastically.  And there isn’t anything that can be done about it; it looks like not even the forthcoming OS X Lion upgrade is going to support TRIM unless you buy the computer with an SSD pre-installed by Apple.

Thursday, March 18, 2010

I Love My Computer

So I had been using the same computer for 5 years.  I built it back in 2005, when dual-core processors first became available for PCs.  It was getting really old, and it was really getting in the way of my being efficient and effective in my work.

I decided last summer that I was going to build a new one, but I knew that one of the key parts I was looking at was very shortly due for an update, so I held off.  Then a couple months ago my company offered to pay for my new machine.  Finally, on Feb. 28th the part I was waiting for (the CPU) became available, so I ordered the parts and built it.

So, a quick rundown on what’s inside.

  • Intel Core i7 930 Processor at 2.8 GHz
  • 12GB GSkill DDR3-10666 RAM (6 x 2GB)
  • Intel X25-M 160GB Solid State Disk
  • 2 x Western Digital Caviar Black 640GB (in RAID-0)
  • 1 x Western Digital Caviar Black 1TB
  • NVIDIA GeForce GTS 250 Video
  • NVIDIA GeForce 8800 GT Video*
  • NVIDIA GeForce 8500 GT Video*
  • Windows 7 x64 Ultimate Edition

* Brought over from previous machine

Why 3 video cards?  I use 6 monitors, and each card drives two of them.

There are a few things that make this computer so awesome.  First, the 8 virtual cores (so it works sort of like 8 CPUs).  Second, the amount of memory (12GB).  Third, the Intel X25-M solid state disk (like a hard drive, but it uses flash memory instead of moving parts, so it is much faster).

What all of this means is:

  • The computer boots up in 14-16 seconds. (From the Windows logo first appearing to all software loaded and the desktop fully ready to use.)
  • There is NO period of waiting for my 57 zillion startup applications to finish loading after a restart; they’re done loading before the desktop even appears.
  • Most software applications open virtually instantly:
    • Microsoft Office applications are totally finished loading in well under 1 second.
    • Photoshop CS3 loads in less than 2 seconds.
    • Premiere Pro CS3 loads in about 3 seconds.
    • Microsoft Outlook loads and is ready to use in less than 2 seconds.
    • Internet Explorer, Chrome, Opera, and Safari all load instantly.
    • Firefox loads in about 5 seconds.
    • My development environment (Delphi RAD Studio 2007) loads in about 15 seconds instead of 2 minutes.
    • The Zune software loads in about 3 seconds and is always snappy.
    • iTunes loads in about 1 minute instead of 5.
  • With this much memory, I can leave all of the software I use regularly running in the background; I never have to close anything if I don’t want to.
  • Web browsing is much more snappy, even without getting a faster Internet connection.
  • Editing standard definition video is actually very fast.
  • Editing high definition HDV video is not only possible, but it is easy.
  • When I’m programming and I pause to think, the development environment I use only freezes for 2-3 seconds instead of 30-90 seconds. (I was literally losing hours of my time per week to this.)
  • Multitasking is seamless.  I can start a video render, minimize it, and not even feel the effect of it while using other applications.
  • Encoding DVDs into h.264 video happens at a rate of 140 frames per second (as opposed to 8 on my previous machine), so movies finish in about 40 minutes instead of 12 hours.
  • File transfers over my network run at 50 MB/sec instead of 7-8 MB/sec.

Total cost of hardware was just under $2,000.  My company covered $1500 of that.

This computer is just a pleasure to use.  I don’t have to wait for it to do anything. 

It hasn’t been without a few hiccups though…

  • My external MOTU 828mkII FireWire-based sound device (primarily designed for doing studio recordings) has some really buggy drivers.  Sound sometimes gets distorted.  This was a problem with the old computer, too.  It’s apparently universal, as others (both PC and Mac users) are having the same issue with this same device.  If it weren’t so expensive I’d trade it in.
  • Occasionally when I restart one of the video cards isn’t detected, so I have to restart again for it to come back.
  • I can’t get the fingerprint readers I use for developing my company’s Point-of-Sale software to work at all (no driver available).
  • It takes me days, if not weeks, to get all of my software re-installed.

But even with those small issues, I love this machine. 

And for anyone wondering why I didn’t get a Mac: I spec’d out a “roughly equivalent” machine (they don’t have an exact equivalent) and it would have been $7,173.  While some areas would be better (CPU), it would still be lacking in a few areas, like memory and video performance, and upgrading the video to something equivalent would add several hundred dollars more.  If I were to switch to OS X, re-purchasing the software I use regularly would cost an additional $10,000+, and most of the applications I use every day aren’t available for OS X at all.  Economically it just doesn’t make any sense whatsoever.

Thursday, October 29, 2009

Tech Tip: External Hard Disk Drives

If you’re thinking of buying an external hard disk drive for your computer, here are a few things to remember:

  • Of the different ways available to connect, USB is by far the slowest.  And it slows your computer down when it is being used.

    Every bit of data that goes to and from USB devices has to be handled by your computer’s CPU.  This not only makes data transfer to and from your external USB hard disk drive slow, but it also slows down the rest of your computer.  Other interfaces, like FireWire and eSATA, can transfer data directly to your computer’s memory without going through or waiting for the CPU, making them much faster.  eSATA is the fastest, but not very many computers have eSATA ports yet (especially laptops).  FireWire is more common, but still not available on lower-end computers.  Desktop computers can have an add-in FireWire or eSATA card installed relatively inexpensively.  Speed-wise, eSATA is faster than FireWire, which is faster than USB.
     
  • Generally speaking, bigger hard disk drives are faster than smaller drives.

    The size of the actual platters containing your data remains the same, so bigger disks have to pack more data into a given area than smaller ones.  The more densely the data is packed, the more is read by the drive each time the platter spins a single rotation.
     
  • Generally speaking, bigger drives are more prone to failure.

    Because the data is packed more densely, any portion of the disk going bad takes more data with it, and the higher density is more sensitive to imperfections in the platter surface.  Bigger disks also tend to use newer, and thus less time-tested, technologies.
     
    I never buy the newest disk drives.  I always wait until a drive has matured before I will consider investing.
     
  • For USB and FireWire external hard disk drives, rotation speed doesn’t affect actual performance a whole lot.

    Manufacturers usually advertise the rotation speed of their drives, usually 5400 or 7200 RPM.  Just because a drive is 5400 RPM doesn’t necessarily mean it is going to be a lot slower than a 7200 RPM drive.  Especially if you are comparing a larger 5400 RPM drive to a smaller 7200 RPM drive (see the second principle, above.)
     
    For FireWire and USB drives, the performance bottleneck is the drive’s connection to your computer, not the speed of rotation.
     
  • As far as power consumption and heat issues go, 5400 RPM drives are a better choice than 7200 RPM.

    Drives that spin slower use less power and generate less heat.  And tend to last longer.
     
  • Brand name makes some difference, but outside of purchasing an external drive marketed by one of the major manufacturers, you never know what you’re getting.

    The major drive manufacturers (Seagate, Western Digital, Samsung, Hitachi, Fujitsu, Toshiba) market drives under their own names.  Other companies sell external hard disk drives, but usually use drive mechanisms from one of the big manufacturers, and there isn’t any way to know what brand of drive you’re actually getting.
     
    I have been buying Seagate drives for years without any issues, and that is what I generally will recommend. Hitachi drives have also been good for me.  My track record with Western Digital has been iffy.
     
  • External drives that plug directly into your network, allowing multiple computers to access their contents at once, do exist.  They are called NAS (Network Attached Storage) devices, and while they can be convenient in some ways, they can be difficult to set up, a little pricey, and the slowest of any external drive solution.
     
  • The happy medium between price and storage size right now is 1 TB for 3.5” drive mechanisms, and 500 GB for 2.5” mechanisms.  Going much bigger than that generally demands a hefty price premium.
     
  • If a drive fails, you will lose everything on it.  So it might be better to have two smaller external drives than one huge drive.
     
  • Some manufacturers are offering great warranties (some Seagate drives have a 5 year warranty!), others are just 90 days.  Read the packaging closely.  Every hard disk drive is going to fail someday, and even the best won’t make it much beyond 3-5 years.  Having a good warranty will get you a replacement when yours dies.
     
  • Having a good warranty will allow you to get a complimentary replacement, but it won’t get your data back when a drive fails.  Always store multiple copies of your data in different places.

Generally speaking, for best performance get eSATA or FireWire before getting USB.  USB is available on nearly all computers, where FireWire and eSATA are not.  Check to see what ports your computer has.  Get a drive that is going to have sufficient storage for you for a few years, but don’t go excessively large.

Wednesday, August 5, 2009

Verizon MiFi = Freaking Awesome

I have been in Lubbock, TX this last week troubleshooting some computer problems in a pizza restaurant there.  It went very smoothly overall, and a lot of that has to do with a new toy/tool I picked up at the beginning of my trip… the Verizon MiFi2200 3G router.

I’m going to assume that anyone reading this isn’t already familiar with the device.  It is not only new, but a new kind of product we hadn’t seen at all prior to about May of this year, so I can’t expect anyone outside the most tech-savvy crowd to be familiar with it.  Essentially it is a Wi-Fi router that uses Verizon’s 3G network to provide Internet access for up to 5 devices simultaneously.  Since it uses Wi-Fi, it doesn’t require that any software be installed on the computers that connect – it just shows up as a wireless network.  Any device that has Wi-Fi – whether a laptop computer, cell phone, Zune, or another Wi-Fi-enabled MP3 player like an iPod Touch – can get on the Internet quickly by connecting to the MiFi’s network.  And with the ability to connect any 5 devices simultaneously, sharing Internet with family or friends becomes possible as well.

The device itself is very small.  Tiny, in fact.  It is exactly the same size as a stack of 8 credit cards, so it easily fits in a pocket.  Or easily left in a backpack without even noticing that it is there.  It’s quite light as well, weighing just a few ounces.

More than half of the volume of the device is taken up by its battery.  And since it is battery powered, it doesn’t have to be plugged into anything at all to operate.  Just press the (only) button to turn it on, and it automatically connects to Verizon’s 3G network and enables its built-in access point.  From there, all you have to do from a computer is connect to its Wi-Fi network and you’re online.  If that computer is set up to automatically use the MiFi’s network, it all happens without any additional button clicking.  Very easy, very slick.

The Wi-Fi router has WPA/TKIP security turned on by default, and the SSID (access point name) and security key are printed on the bottom of the MiFi for easy access.  It also supports WEP (yuck) and WPA2/AES, and all flavors of 802.11b and 802.11g.  The SSID and security key are both customizable, and it offers many of the other options you expect to find in modern Wi-Fi routers.

This is not my first mobile Internet access device.  I have had several over the years.  In addition to tethering from my cell phone for the last 3 years, I have also purchased a T-Mobile EDGE PC Card and Cricket USB modem.  T-Mobile’s EDGE has worked well aside from the relatively slow speeds compared to those of 3G networks, but I have been on a plan that allows unlimited data for just $20 per month, so I haven’t cared much.  In most areas T-Mobile’s EDGE is a lot faster than AT&T’s (220-250 kbps downstream has been typical), so it has been quite usable.  In the right geographical areas it actually feels fairly snappy.  But tethering is a little inconvenient compared to a USB device, so I picked up the Cricket USB 3G modem a few months ago.  It has worked pretty well, but Cricket’s network covers a pretty small area compared to other carriers.  When I arrived in Lubbock this last week I knew that Cricket would not be available so I borrowed an AT&T 3G card, but its reliability and performance were disappointing at best.  So I picked up the Verizon MiFi about a week ago and I have loved it. 

Setup would have been a breeze if the first one I got had actually worked.  It powered on, but didn’t show up at all in my Device Manager when connected to my laptop.  A quick phone call to Verizon’s tech support confirmed my suspicion that I had gotten a bad one, and one more trip to the Verizon store and a swap for another resolved the problem.  Once I plugged in the second one, it showed up as a CD-ROM drive on my computer containing the setup software.  Once that was installed and launched, a click on the Activate option got it up and running.  It only took a few minutes from start to finish.  It would have been nice not to have to go through the activation process at all, but considering how easy it was, it isn’t much of a hurdle.

It has been so incredibly easy to use and reliable that during the 2nd half of my trip I gave up using my hotel’s Internet access entirely in favor of access via the MiFi.  The connection in my hotel was fast, but went down fairly consistently.  I happily gave up the higher speed in favor of something that just worked reliably, and the MiFi and Verizon’s 3G network did not disappoint.  It just worked everywhere I went.

So far performance has been excellent.  I ran a speed test on the device just prior to starting this blog post.  The download speed is actually faster than my Internet connection at home, though the higher latency (ping time) makes general web browsing feel noticeably slower.  Upload speeds are quite good for a mobile device (cable modems, for example, usually max out around 0.38 Mbps), though much slower than my connection at home.  Upload speeds really aren’t very important in a mobile environment, though, so I can’t knock it too much.  I don’t believe this should be anyone’s sole Internet connection, but it sure makes a great mobile alternative.

I haven’t really tested its Wi-Fi signal range, but I have read online that it is good to about 30 feet.  Plenty for its intended purpose.  In most cases it will probably be sitting just a few feet away.

The device, as great as it is, isn’t quite perfect.  Because of its small size the battery isn’t very large, and as a result about 5 hours is the most anyone can expect to get out of it before needing to plug in.  Fortunately, it can be charged using an included USB cable.  I found the battery to drain faster than the quoted rate, but a lot of that is because I’m a heavier user than most will be – the more data transferred, the shorter the battery life.  As always, YMMV.

It takes around 12 hours to charge via USB, but that same USB connection also provides Internet access for the connected computer while plugged in.  Unfortunately, plugging into a computer via USB also disables the Wi-Fi radio so ONLY the connected device will have access to the Internet.  This is not true of the AC power adapter, however, which both charges the device and allows the Wi-Fi router to remain active.  Charge time is reduced to 7-8 hours via the AC adapter, but that is still a long time.  (One down side is that while being charged the device is always powered on.)  It would be nice if the charge time was less than (or at least the same as) its battery life, but I guess I won’t complain too much.  How often will I need 5 hours of Internet access in one sitting while mobile anyway?  And being able to use the device via USB makes the battery life limitation seem a lot less confining.  It also has a mode that emulates a USB network adapter, so unlike many mobile devices I don’t have to manually dial a connection to get online when plugged into USB; just plugging it in gets me connected to not only the Internet but also my home network via VPN all automatically.

The other big complaint is with the wireless access plan that Verizon offers.  At $60/month it is the same as the other major wireless carriers, but it still seems a bit too high for just 5 GB of data.  I would be a much happier customer if that rate were cut in half.  Verizon offers less expensive plans (a monthly plan with 250MB for $40, or an on-demand plan at $15 per day), but neither of these is very useful unless a device like this is going to be used very rarely.  (Unlike AT&T, Verizon’s contract does not prohibit streaming video from the Internet, making the 5 GB allowance all the more significant.)  I think anyone willing to invest in this thing probably falls into the crowd that needs the 5GB plan rather than either of the other options.

So far I have not been able to get my phone to connect to it either.  For some reason it doesn’t even see the MiFi’s wireless network at all, and even when I manually enter the information necessary to connect it still won’t make a connection.  Not a huge deal since my phone already has its own Internet access, but it is confusing nonetheless.  I’m going to have to investigate that one a little further.

The purchase price was $150, less a $50 mail-in-rebate with 2-year contract.  Not great, but not terrible either.  The 2-year contract has a $175 early termination fee (which goes down by $5 per month of active service).

Overall I highly recommend the MiFi2200 to anyone who needs mobile Internet.  It offers so much more functionality than other mobile broadband devices, and does so without much of a price premium. It just doesn’t make a lot of sense to get anything else right now.

Sunday, July 26, 2009

Apple Tablet: Small Mac, or Big iPhone?

There are rumors circulating that Apple will be releasing a tablet device sometime early next year.  There are certainly a lot of Apple fans that are very excited about such a device, even though they don’t even know what it will be.  (Can you think of anyone besides Apple that can get people excited about something that doesn’t even exist (and hasn’t even been announced) yet?)

There is very little information to go on at this point.  The rumors seem to be indicating that it will be a 10.1” touch screen device priced around $799.  Other than that we really don’t know much, including what it will do, or even whether it would run a full version of OS X or a modified version of the iPhone OS.  There is a rumor that the device will have some sort of cellular radio for Internet connectivity as well, but again, none of this is confirmed.

But even amidst Apple’s perpetual silence on future devices, I think there is a lot about it that we can conclude, should such a device actually come to pass.

The pricing alone could tell us a lot.  If it is priced at $799, it is $200 below the price of the white MacBook.  And about $200 over the selling price of the iPhone (price to carriers, not consumers). That alone tells me that the device will be one of two things: either it’s going to be a lobotomized netbook, or a large multimedia device.

I come to that conclusion based on the products that Apple already has in its lineup.  Think about this: the current Mac “netbook” is the MacBook Air.  In a lot of ways it is much like the PC netbooks that are on the market: small device, lightweight, low power, missing devices like optical drives and myriads of connectivity.  In fact the specifications on PC netbooks aren’t that far off of the MacBook Air, aside from the 13” screen that the Air offers where the screens on netbooks are usually 9-10”.  (Even the CPU isn’t that different between the two.)  An Apple netbook would have to be essentially a smaller, even more stripped down version of the Air at a lower price.

But we know that the device will have a touch screen.  That adds to the cost of a device.  And Apple won’t be happy if it doesn’t support multi-touch, and multi-touch capable touch devices are more expensive than the touch screens used on Tablet PCs.  For Apple to be able to release an OS X-powered computer that offers decent performance, plus a multi-touch touch screen, the price is going to be somewhere near where the MacBook Air is now, if not higher.  Nobody would buy it.

The other problem with trying to go the OS X route is that OS X just isn’t designed for a touch screen interface.  I develop software for touch screens and have learned a lot about what works and what does not.  Menu bars, like the one that stays at the top of the screen at all times in OS X, are totally unusable on touch screens.  Buttons, in order to be clickable, must be at least 3/4” wide and high.  A typical button in the OS X user interface would only be about 3/8” high on a 10.1” screen, making it too hard to press accurately.  A screen that size is simply WAY too small to even consider for general purpose computing.  Even the 15” screens we use in my business are too small for that unless the software is designed specifically for the application – and OS X applications are not.

This, of course, ignores the fact that a tablet device lacks a keyboard.  Much of what people do on computers is based on having a keyboard.  People, especially Mac users, use their computers to browse the Internet, read email, write documents, edit photos, listen to music, and watch and create videos.  As we have seen with the iPhone, browsing the Internet can be done on a keyboard-less touch screen device, but its usefulness is limited.  Composing email without a keyboard is totally impractical.  And editing photos and creating video are both difficult (at best) on a low-resolution screen, especially when the device’s likely low-capacity hard drive is considered.  That leaves us with browsing (sometimes), watching video, and listening to music.  What does that list of activities sound like?  Yep.  There’s your iPod Touch/iPhone functionality.

Yes, I know the iPhone has an on-screen keyboard.  And that works okay for creating short text messages, or even short emails.  But for composing larger emails or documents, a touch keyboard just won’t do.  People like having the tactile feedback of actual keys to press when typing, especially as keyboards become larger than the screen on an iPhone.  A decent size on-screen keyboard on a tablet would fill more than half of a 10” screen, and that doesn’t leave any sort of room for software to run.  You couldn’t even rest your fingers on the screen because the act of just touching the screen would activate the capacitive touch sensor, so you’re left hovering your hands above the display.  This becomes very tiring very quickly.  Short of the tablet device being a netbook with a real keyboard, I just can’t see Apple trying to run OS X on a device this size. 

I think the prospects of an Apple tablet being based on the iPhone OS, however, are much higher than something based on OS X.  Nearly everything sort of falls in line with what we know. 

Since tablet devices are, by their very nature, touch-based, it would make a lot more sense for Apple to start with a product that is already based on touch.  OS X is not, and it would take a major overhaul of not only the OS but all of the applications that run on it to work in a touch screen environment.  The iPhone OS, on the other hand, is totally designed around a touch screen.  Touch, swipe, pinch; all of these are the fundamental operations that take place on a touch device and they’re already supported on the iPhone. 

One of the problems with the iPhone and iPod Touch is the small screen.  Watching videos on something that small is not fun, especially if you are trying to share content with someone else.  You can’t comfortably have a group of friends crowd around an iPhone to watch a video; it makes a lot more sense to take turns.  Or for Apple to release a device with a larger screen. 

An iPhone-like device with a 10” screen could be a very good multimedia player.  It’s big enough for the kids to watch in the back seat of the minivan.  Or large enough to watch a video comfortably on an airplane.  Or perhaps even large enough to become an ebook reader that competes with the Amazon Kindle.  (Yeah, battery life wouldn’t be as good, but I think most people are used to charging their electronics every night anyway.)

Creating such a device wouldn’t be without its own set of hurdles, though.  Something with a 10” screen absolutely has to have a higher resolution screen than the iPhone, so all of those apps in the App Store aren’t going to work without significant reworking to fit the larger screen (or look absolutely horrendous after being blown up to fill the larger display).  So should the mysterious Apple tablet be iPhone-based, expect that it will be limited to off-the-shelf iPhone capabilities for a while after release until developers have a chance to rewrite their software to fit the new screen.  But it would surely have music and video playback as well as web browsing built-in from the start.

But one of the biggest indicators to me that something like this will be iPhone OS-based is that Apple has a hole in their lineup of multimedia devices.  You can listen to music and watch videos on tiny devices like the iPods and iPhone, or something big like a computer monitor or TV by using the Apple TV.  There is a class of devices between the iPod and computer that is missing.  Apple wants to sell you iTunes content, but they don’t have anything that competes with portable DVD players.  The largest portable device (aside from laptops) for watching iTunes video is the iPhone. 

Also consider this… there are rumors that Apple is in talks with Verizon about a mobile data partnership for the tablet device.  $799 might be a little steep for the average consumer.  But if the device were tied to a mobile data plan (like the iPhone is), that $799 might come down to $299 or $399 with a Verizon contract.  That puts the out-the-door price right in line with the iPod Touch, Apple TV, and iPhone – all of which target the same demographic Apple is already catering to.

I really don’t see Apple releasing a device based on OS X if it is strictly a touch screen.  And such a device certainly wouldn’t cost $799.  There is already a Mac-based device with a tablet display, the Axiotron Modbook.  It’s very expensive, and it’s only interesting to a very small segment of the market.  A $799 multimedia device is much more appealing to a vastly larger group of people.

The other thing that really leads me to believe that this will be a multimedia device rather than a computer is that tablet computing is not something the masses are interested in just yet.  Microsoft’s Tablet PC features are very well implemented in Vista and Windows 7, and yet tablet PCs are only being picked up by a very select group of people.  Outside of the world of doctors, salespeople, and maybe a few in the construction industry, a tablet computer just doesn’t make a lot of sense.  And for that group, a device at $799 might as well be $1999; they’ll pay whatever it costs.  Trying to sell the general public a $799 tablet computer just doesn’t make sense for Apple or its shareholders, especially considering how much they like their premium product markup. 

Since Apple likes to charge a 50-100% premium for their products, we ought to look at what’s in the market that would compete with a $799 product.  Yes, netbooks fall into that category, but not tablet PCs.  The other thing we find in that segment would be… you guessed it, portable video devices.

All of this, coupled with the total lack of rumors or leaks about Apple’s upcoming Snow Leopard having any sort of support for a touch-based Mac, plus the rumors of an unidentified device running the iPhone OS, leads me to believe that there is absolutely no way a 10” touch-screen tablet is going to run OS X as we know it now.  Aside from creating a brand-new OS family for such a device, the only choice Apple has is to base something on the iPhone OS.  It makes perfect sense, while all of the other options defy logic.

Saturday, July 25, 2009

To Netbook, or Not To Netbook, Part II

Yes, I have written about netbooks before. But it keeps coming up and people keep asking me questions. So here’s take two.

For about the last year or so there has been a huge craze around “netbook” computers. They are selling like hotcakes. Personally, I don’t think I’ve ever purchased any hotcakes, but I’m told that they set the standard for product sales. Nevertheless, netbooks are very popular. But before you consider buying one yourself, it might do you well to understand what they are and aren’t good for.

Simple Tasks, Simple Machines

Netbooks can be manufactured and sold at low prices because they use lower cost (slower, often older) parts than regular notebook computers. The price you pay (in addition to the lightening of your wallet) is performance and usability. Netbooks are fine for some tasks, but are horrendous at others.

Simple tasks like reading email and browsing most web pages usually work fine on netbook computers, if you can live within their limitations. Personally I don’t find their slow processors to be much of an issue (most of us don’t really need fast CPUs), but they have other limitations that might be show-stoppers.

Tiny Screens

The most significant limitation on most netbook computers is the screen. Not only are the screens small, they are also low resolution, so you can’t display very much at a time. In fact, if you install a couple of toolbars in your browser, nearly HALF of your vertical screen space will be used up by buttons, menus, etc., even with your browser set to Full Screen mode. That’s an awful lot of space taken away from the web site you wish to view. Email isn’t much better: the Preview Pane in Outlook only shows about 4 lines of each message… hardly useful at all.

Dell and HP both make netbooks with high-definition screens, which are MUCH better, but these are rare, must be specially ordered, and add noticeably to the price tag. (Don’t expect to pick one up at Best Buy.) The screen resolution on netbook computers is typically 1024 pixels wide by 600 (or 576) pixels high (that’s 0.6144 megapixels for anyone counting). The width is fine, but the height can become a real limitation. We’ve become used to high resolution monitors on our desks, and so have software developers and web page designers. Everything is designed around larger screens, so expect to do a lot of vertical scrolling no matter what you’re doing on a netbook. And please don’t expect to edit photos or videos on a screen that small unless you happen to enjoy pulling out your hair.
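
Curious how those numbers work out? Here’s a quick back-of-the-envelope sketch in Python. Every chrome height below is a rough guess on my part (they vary by browser, theme, and toolbar), so treat it as an illustration, not a measurement:

    # Rough sketch: how much of a 1024x600 netbook screen is left for a web page?
    # All of the chrome heights are ballpark assumptions, not measurements.
    width, height = 1024, 600
    print(f"Total pixels: {width * height:,}")  # 614,400 (~0.61 megapixels)

    chrome_px = {
        "Windows taskbar": 30,
        "browser title bar": 25,
        "menu bar": 25,
        "navigation toolbar": 35,
        "bookmarks bar": 25,
        "two add-on toolbars": 60,
        "status bar": 25,
    }
    used = sum(chrome_px.values())  # 225 pixels
    print(f"Chrome: {used}px, or {used / height:.0%} of the height")
    print(f"Left for the actual page: {height - used}px")

With these guesses you lose nearly 40% of the height; add another toolbar or two and you really are approaching half the screen.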

Storage

Storage may or may not be an issue as well. Many netbooks come with Solid State Drives (SSDs for short). These use memory chips to store your data instead of rotating hard disk platters. While more resilient to vibration, SSDs are much more expensive than hard disk drives for an equivalent amount of storage. So, to keep prices down, netbooks with SSDs have VERY small amounts of storage compared to models that have hard disk drives. One of the main virtues of more expensive SSDs is that they can be much faster than hard disk drives, but don’t expect any wonderful performance from the low-end SSDs used in netbooks. My advice: the only real reason to consider a model with an SSD is to lengthen battery life. Otherwise I recommend going with a traditional hard disk drive instead.

Keyboard

One last limitation to look at is the keyboard. Some models are okay. Others stink with a capital S. The keyboard on my Dell Mini 9 is infuriating. In order to make the keyboard fit on a small device, Dell decided to relocate the apostrophe/quotation key to the bottom row. Each time I go to add an ’s to the end of a word I die a little inside. Other keys have been moved too, and it’s really hard to adjust back and forth between the keyboard on the Mini 9 and a regular keyboard. Most other models are much better about this, but I strongly recommend trying a netbook’s keyboard before considering a purchase. Any funkiness in the layout is sure to enrage you later on.

Memory

Netbooks also nearly always come with 1 GB of RAM or less. That’s probably fine for people who run one or maybe two programs at a time, but it isn’t enough if you’re someone who regularly keeps 5-6 applications open all of the time.

Saving Money… Really?

The most common reason people buy a netbook instead of a notebook computer is the low price. (Some truly buy for portability, and I won’t argue with that… much.) But I’m not sure the apparently lower price is worth it.

A typical usable netbook configuration is going to be in the $400 ballpark. Sure you can get one for a lower price, but you have to give up something to get there. For between $400 and $500 you can also buy a reasonably-spec’d full-size notebook computer. Yes, the prices have dropped that much. Most manufacturers offer at least one or two models in that price range. And you get a lot more. Faster processor, more memory, bigger hard disk drive, larger (higher resolution) and higher quality screen, more external connection ports, bigger battery, just to name a few. If you’re considering a netbook because of its price, you’d probably be a lot happier buying a notebook instead, without denting your wallet too much more, if at all.

It also isn’t uncommon to find that low-priced netbooks run the Linux operating system instead of Windows. This is strictly a cost-saving measure: Linux is free; Windows must be paid for. But unless you’re already a Linux guru (in which case you probably wouldn’t be reading this post), skip it. It isn’t worth it.

It’s Cute ‘Cause It’s Little

Size is a different issue altogether. Netbooks are indeed smaller and lighter than notebook computers. They typically come in at around 2.5 pounds, and around an inch thick. They tend to be much smaller in person than you expect based on pictures online. Modern small notebook computers, on the other hand, are usually 4.5+ pounds, have a much larger footprint, and are just over an inch thick.

While netbooks are small, they aren’t tiny. If I were a woman I wouldn’t carry one in my purse “just because.” I certainly don’t take mine everywhere I go, but when I need something to browse the internet, check email, or answer a customer support call, my netbook is my preferred device because of its small size and light weight.

Time to Buy?

Now is probably not the best time to be buying a netbook; it might be better to wait until late this fall if you can. The reason is Windows 7.

Windows Vista does not run well on netbook hardware. Vista’s hardware demands are a little too much for netbooks to handle, so netbook manufacturers have been shipping the machines with Windows XP instead. Windows XP runs pretty well. But Windows 7 runs very well on them too, often better than XP.

Windows XP was discontinued before the netbook trend even started, so it has never been updated or optimized for netbook hardware. Windows 7, on the other hand, has been. It knows how to handle and optimize for the CPUs and storage devices used in netbooks; XP doesn’t. I have been running Windows 7 on my netbook for several months and it works wonderfully! It has fewer issues than XP ever did, and the performance is just as good, if not better. Windows 7 was also much easier to set up than XP, because I didn’t have to go hunting for drivers for hardware that was too new for XP to recognize.

Many regular notebooks being sold with Windows Vista are eligible for free upgrades to Windows 7 when it ships in October. But netbooks are left out for two reasons: (1) Windows XP, and (2) pricing. Windows XP doesn’t qualify for the free upgrade, and computer manufacturers are actually paying for the Windows 7 upgrade on behalf of Vista users. Since netbook prices are so low, there isn’t enough markup in any netbook for the makers to cover that upgrade, even if they did include Vista.

Presumably netbooks will begin to ship with Windows 7 sometime this fall. The official release of Win7 is October 22nd, but manufacturers are free to decide when they will begin to put 7 on their machines after that date. Some will probably act quickly, while others drag their feet. Either way, I really think it is worth waiting for Windows 7 instead of settling for Windows XP now.

Should I Get One?

If size and portability are more important to you than power and capability, a netbook might be the right thing for you. I don’t think price alone is a good reason for one, though.

One last bit of advice: don’t make it your only computer. You’ll probably learn to hate computers if you don’t have something else to work on regularly. Netbooks are supplementary devices, not designed to be anyone’s primary machine by any stretch of the imagination.

Using a netbook can be sort of like trying to ride a scooter on the freeway. Yeah, it gets you to your destination, but it probably won’t be very fun getting there.

Thursday, June 18, 2009

Desktop

What does your computer desktop look like?

Here’s a pretty typical example of mine…

[screenshot: my desktop]  Except I hate it when my text goes blurry like that.  Oh well.

Sunday, May 31, 2009

Show us your computer!

Just for fun, show the world your computing work/play environment. Whether that be an actual desk, or a comfy couch where you do your computing, show the world what works for you.

[photo: DJWorkstation]

Here are the rules:

1. Copy and paste this blog post into your own blog without modifying the rules.

2. You aren’t allowed to clean up for the picture. We want to see real-world setups, not “I hired an interior designer just for this photo” pictures. No cleaning, no reorganizing. Everything just the way it is right now, no matter how messy, clean, or organized it may be.

3. Tell us who prompted you to post pictures of your workstation. Ideally, post links to their blog post where they posted their own photos. Paste below:

(Since I started this, I can’t post any links to anyone else, sorry!)

4. Tell us anything you want about your setup. Why you work where you do. Anything unique about your computing environment. Paste below:

I’m definitely not neat when it comes to my desk. It always has piles of stuff on it, as you can see.

I’m a definite computer multitasker. And since most of the software I use regularly takes up a lot of screen space I work with multiple monitors all of the time. Here four of the available 6 are turned on and running, with the old CRT monitor and LCD TV making up the other two when needed. The monitor on the far right is a touch screen. That’s the display for my Mac off to the left.

In addition to doing software development, I also run my little recording studio and video editing setup from here. That explains the acoustic foam, audio equipment, and piano. I also have Blu-ray and full surround sound set up in here for those times I watch movies while I work.

The actual computer itself is in the room next to the one pictured here, so you won't see it. That way any sounds the computer makes (fan noise, beeps, etc.) are totally inaudible. I had to run a 2.5" bundle of various extension cables through the wall to pull that off, but the room is completely silent as a result.

5. We prefer to have YOU in the picture too, but if you can’t pull that off, we still want to see where you work.

6. Paste in a high resolution image if you can. Not so high that we can read your checking account number on your monitor, but high enough that we get an idea what you’ve got going on technology and space-wise.

7. Post on your blog, and share with the world!

8. Tag friends and other blog readers to do the same.

9. Paste a link to your blog entry on my blog in the comments.

10. Optionally, post a link to your blog entry back in the comments section of the blog entry that started it all, http://doubledeej.blogspot.com/2009/05/show-us-your-computers.html.

Saturday, May 30, 2009

Viruses are NOT a Technology Problem

There is a myth that has been going around for YEARS that if you run Windows on a computer it is automatically going to become infested with viruses. It is perpetuated by many, particularly in the “I’m a Mac, I’m a PC” ads, but also by the companies that create anti-virus software in hopes that you’ll buy their products to protect yourself from the inevitable technological intrusion into your virtual computer space. And most of us buy into it. The truth is that running Windows does NOT guarantee that you’ll become infested with viruses. (I’ll prove it later in this post.) Windows in and of itself is not the problem. The problem isn’t even technological at all. It’s social.

The term used to describe the techniques virus writers use to get their software onto your computer is “social engineering.” Basically it means they trick you into installing the viruses yourself. They’ll do things like disguise their software as something you’re likely to want or want to see. They use methods to make you believe that these things are coming from trusted sources, like friends or family. Combined, those are pretty effective methods. (And truthfully, these same methods work on ANY operating system; they aren’t specific to Windows.)

This might be a blow to the ego of some, but if your computer has become infested with a virus, it is because you let it install itself. You opened a file you shouldn’t have. You installed some software you shouldn’t have. You are the one to blame that it is there. Please don’t blame your computer. Don’t blame your operating system. You did something that let the bad stuff in. The wolf knocked at your door, and instead of replying with a “not by the hair of my chinny-chin chin” you said “come on in.”

Personally I don’t run anti-virus software. I never have. I do install it, because that’s what you’re “supposed” to do, but I don’t let it run, scanning and watching my computer all of the time. After I install it, the very first thing I do is disable it. I don’t like the slowdown that comes with having everything I do monitored by bloated software that isn’t going to find anything anyway. And despite the fact that I do not run anti-virus software, I have NEVER had a single virus on ANY of my computers. Ever! I’ve been running Windows for nearly 15 years and I haven’t had a virus yet. I’ll run an anti-virus scan every once in a while just to make sure that I’m still clean, but NONE of those scans has EVER found even a single virus.

If susceptibility to viruses were a technological problem with Windows, my computers would be massive infestations of virus muck. They wouldn’t be usable. And they’d be out there trying to find ways to infect others. How have I been able to remain clean? Just by being careful about what I install and keeping my computer up to date with security patches. That’s it. No more. No magical hardware firewall watching my Internet activity. No magic fairy showing up in the middle of the night to clean off anything that may have arrived that day.

But the situation gets even worse for the theory that Windows inherently becomes infested with viruses when I tell you that I also don’t run any firewalls. Yep, I turn those off too. And here’s another kicker… I break the cardinal rule of data security: three of my computers have public IP addresses (meaning they are totally exposed to, accessible from, and visible to the Internet). Gasp! That’s an absolute security no-no! Nobody should EVER run Windows with a public IP address, right? Well, I wouldn’t recommend it for most people, but the truth is that Windows, despite its many flaws, is not the primary cause of viruses becoming installed on our computers, so I really don’t worry about it. Viruses are installed by people, not their operating system. It’s people tricking other people into installing their ill-intended garbage that gets computers infected.

I’m not the only one who doesn’t run anti-virus software. In a recent episode of the Security Now podcast, noted security expert Steve Gibson admitted that he doesn’t run it either. If a security expert doesn’t run it, then the computer he’s using isn’t the main cause of the problem, is it!?

So why do Windows PCs so often have viruses? Mostly because they’re so popular. If you’re someone conjuring up evil plans to take over the world by creating virus software, who are you going to target? The 90% of computers running Windows? Or the 7% running a Mac, or 1% running Linux? Which offers a better return on your time investment?

Windows XP also made an easy target because it makes installing software so easy. No password or validation is required to do an installation; installers can just run and do whatever they please whenever someone starts them. (That has changed with Vista; passwords and validation are required there, just like on OS X and Linux.) Not requiring a password to install has never been a good idea, but it isn’t the cause of viruses on computers; it just made things easier for the bad guys. Big difference. And remember, viruses are just software; they simply have a different intent than something like Firefox.

With all of this said, I will not recommend that most people run without anti-virus software or a firewall. Most people should take those steps to protect their machines. But these tools are just extra layers of protection; they should not be the only form of protection used. Neither will ever make up for the shortcomings of the person using the computer. Even with both installed, it’s still up to you to avoid the bad stuff. And that, my friends, is a social problem, not a problem with technology.

Thursday, May 28, 2009

Get a UPS

No, not one of the guys (or gals) in brown.  I’m talking about Uninterruptible Power Supplies.

[photo: APC Back-UPS ES]

UPSes are battery backup systems for electronics.  In this case, computers specifically.  I know you’re thinking, “Why would I want a battery backup for my computer?”  The reasons might not be obvious, but there are many. 

Having a UPS keeps the computer from shutting off abruptly

One of the worst enemies of your computer and its data is an abrupt loss of power.  If you’re working on a document or composing an email, for example, and the power goes out, you’ll probably lose your work.

In addition, in order to make computers perform as well as they do, modern operating systems like Windows, OS X, and Linux don’t necessarily save your data to the hard disk drive immediately.  They hold that data in memory (in a cache) until it is convenient to write it to the drive.  Just because the computer says it has saved your file doesn’t mean it has actually done so.  It’s going to wait until it’s good and ready.
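
If you’re curious what that looks like from a programmer’s point of view, most languages let you force a write all the way through to the disk.  Here’s a minimal Python sketch (the filename and text are just examples):

    import os

    # Write a file and push it all the way to the physical disk.
    with open("important.txt", "w") as f:
        f.write("data I really don't want to lose")
        f.flush()             # empty Python's own buffer into the OS
        os.fsync(f.fileno())  # ask the OS to flush its cache to the drive

Operating systems don’t do that for every write because it would be slow, which is exactly why a sudden power loss can take recently “saved” data with it.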

Having a UPS prevents both of these problems from occurring.  If the power goes out you and your computer will have a few minutes to save your work before the battery is exhausted. 

Having a UPS can prevent damage to your computer

This includes both physical damage due to power surges, brownouts, etc. as well as damage to the data on your computer. 

UPSes usually have surge protectors built in, and they also monitor the AC power in your home for problem conditions.  If there is any sort of problem with your power, the UPS will kick in and switch the power going to your computer from the wall outlet to its built-in battery, preventing physical damage from occurring.  Many computers and other electronics have been saved from destruction by the protection a UPS provides.

Perhaps a more frustrating problem is what can happen when a computer is shut off abruptly rather than being shut down gracefully.  Computers don’t like to be turned off without a proper shutdown procedure; data on your drive, including your operating system and the programs you use regularly, can easily be damaged by an abrupt loss of power.  Files are very often damaged when this occurs, and it can result in a computer that won’t start, generates error messages, or crashes.  It’s an easy thing to avoid by just adding a UPS.

Having a UPS will make your computer more stable

Many of the crashes and lockups that we experience with our computers are due to problems with the power coming into them.  For example, when your air conditioner or refrigerator’s compressor turns on to cool your home or food, it generates some huge spikes and drops in the power on your wiring.  Likewise with washers, dryers, even hair dryers.  Those spikes and drops get passed on to the electronics in your computer, and can easily generate anomalies in the way your data is processed and stored.  You’ve probably seen it in its extreme before… the lights go dim momentarily, and your computer locks up or resets.  But that is an extreme example.  Even the smaller spikes, surges, and drops in power can modify the way your computer behaves.  And the problem may not show up right away.  The data that was corrupted might not be accessed until later, at which point the computer may lock up or crash minutes, hours, or days after the problem really occurred, and you’ll never know why. 

In my own experience, many times computers that misbehave without a UPS suddenly start working perfectly after a UPS has been added.  I’ve seen it time and time again.  This is especially true of budget computers, where they have cut corners on the internal power supply in order to keep costs down. 

If your computer seems to randomly misbehave, there’s a chance it is because it is running on bad power.  A UPS will fix that.

Having a UPS will make your computer last longer

Without having to deal with problematic power, the electronic components that make up our computers will last a lot longer.  And I’m not just talking about preventing immediate damage from power surges; the everyday noise that is present on our power lines does damage over a long period of time.  Running your computer on a UPS will increase its life noticeably.

What about laptop owners?

The very nature of laptops makes UPSes less necessary than they are for desktop computers.  Since they have their own battery they’re automatically immune to power outages. 

But that doesn’t mean that laptop owners won’t benefit from a UPS at all.  The other issues mentioned above, like instability and damage due to dirty power, still apply; laptops are affected too.  And it isn’t a bad idea to put your DSL or cable modem and network router on a UPS as well, to prevent damage and improve reliability there. 

I’m convinced… now what?

It’s important when buying a UPS to get one that is properly sized for the amount of equipment that will be plugged into it.  You also need to decide how long you want the computer to be able to run on battery power for those times when the power goes out completely.  Don’t expect a lot; 10-15 minutes would be considered generous without spending a fortune.  To save a little cash you can get one that will last somewhere between 7 and 10 minutes.
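
If you like to estimate before you shop, a crude runtime calculation looks something like the sketch below.  The load and battery figures are numbers I made up for illustration (check the spec sheet of any model you’re considering), and real batteries discharge non-linearly, so treat the answer as an optimistic ballpark:

    # Ballpark UPS runtime estimate.  Real runtime is lower under heavy loads
    # because batteries discharge non-linearly; use this only as a rough guide.
    def estimated_runtime_minutes(load_watts, battery_wh, inverter_efficiency=0.85):
        return battery_wh * inverter_efficiency / load_watts * 60

    # Assumed example: a desktop plus LCD monitor drawing ~150 W on a small
    # UPS with roughly a 77 Wh battery.
    print(round(estimated_runtime_minutes(150, 77)))  # ~26 minutes, optimistically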

Personally I have been using UPSes by APC for about 15 years, and I love them.  They aren’t the only game in town, that’s for sure, but I do trust their products.  And they have a tool on their web site that makes it easy to find the right UPS for your situation. 

Most office supply stores carry UPSes.  If you buy one there, expect to spend $50-150 depending on the size you need.

If you’re just going to run your modem or router on a UPS, buy the cheapest brand-name UPS you can find; even the smallest capacity will run these devices for quite some time on battery backup power.  Laptops are more power efficient than desktops, so they can run on small-capacity UPSes as well.

If you do get a UPS, please make sure you set it up according to the manufacturer’s instructions.  If you don’t install their software, for example, the computer won’t know when the power goes out, and it will be shut off abruptly when the battery dies, exactly as if you didn’t have a UPS at all.  Connecting the UPS to your computer and installing the proper software allows the computer to know when the battery is about to die and gives it a chance to shut down properly.
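
On Windows the manufacturer’s own software (APC ships PowerChute, for example) handles this for you.  Just to illustrate what’s actually being configured, here’s a minimal sketch of a config for apcupsd, a popular open-source monitoring daemon for APC units; the thresholds are simply the ones I’d start with, not gospel:

    ## Minimal sketch of /etc/apcupsd/apcupsd.conf for a USB-connected APC UPS
    UPSCABLE usb
    UPSTYPE usb
    DEVICE
    # Begin a clean shutdown when 5% battery remains, or when an estimated
    # 3 minutes of runtime are left, whichever comes first.
    BATTERYLEVEL 5
    MINUTES 3
    # A TIMEOUT of 0 means "rely on the thresholds above" rather than a timer.
    TIMEOUT 0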

The cost of a UPS is easily offset by the replacement cost of any equipment it might save over its life.  And that doesn’t count the time and frustration you’ll save by avoiding lost files, crashes, or repairs to your operating system and software. 

I, of course, have UPSes on all of my critical computers, and they have saved me a great deal of frustration over the years.  So I highly recommend them for anyone.

Saturday, May 23, 2009

Pet Peeve #367

<RANT>

It’s a minor thing, I know, but it still bugs me when people use my computers and maximize windows, especially web browsers.  I have large, high resolution monitors on most of my machines, and it’s really hard to read paragraphs of text when each line is 18 inches long.  It’s too hard to figure out which line of text you’re supposed to read next.  Most web sites don’t even work right when they are that wide, or leave huge columns of blank space on both sides of the screen, wasting tons of space.

And yet it seems like nearly everyone who uses one of my computers does it.  I don’t know why, other than force of habit.  It irritates me that they do, and especially that they leave it that way when they leave.

</RANT>

Sunday, May 10, 2009

Putting Things Into Perspective

Every once in a while I think back over the many years I’ve been involved with computers, and with the rate that technology improves it seems like a virtual eternity since I got started. 

The first computer I used regularly was nearly the size of a refrigerator, and it had its own dedicated room in our house.  Today my cell phone is much, much faster, and has 800 times as much storage. 

Storage

That refrigerator-size computer (a DEC PDP-11, which cost as much as a small house) had two disk drives, each one 5 megabytes in size.  The disks were physically about 18” in diameter, about 9” tall.  Swapping the disks wasn’t something we did often, as it basically brought the whole computer to a halt, and the disks were swapped out by opening large drawers.  Not that it mattered to me.  The disks were too heavy for a kid my size to pick up.

The second computer I used regularly had no permanent storage at all at first.  When I would write a program, I’d have to copy it down by hand onto paper to keep it, then type it back in later to re-use it.  Eventually we got a cassette recorder that let us save programs on cassette tapes.  Small programs would take a couple minutes to load or save, larger ones could take twenty minutes or longer.  And a good percentage of the time loading a program would fail, and we’d have to start over again.  Can you imagine waiting 20 minutes to load a program today?  And since computers didn’t multitask, it wasn’t like you could do something else while you were waiting for something to load.  You just had to wait, staring at a blank screen, with horrendous noises coming through the TV speaker.

Those cassettes held approximately 80-90 kilobytes of data.  A single average photo taken on a digital camera today would have required 20-40 cassettes worth of storage at that rate, and would have taken 8-16 hours to load.  That is, if computers of that day had even been capable of displaying photos, which they weren’t.

Eventually we got a 5 1/4” floppy disk drive for that computer.  It stored about the same 90 kilobytes of data as the cassettes, but it was much more reliable and it only took 3-4 minutes to load larger programs, with the smallest ones only taking 30 seconds. 

When I finally got my first hard disk drive in 1989, it cost $600 and it “only” stored 30 megabytes.  Hard disk drives today that hold 33,000 times as much cost less than 1/12 as much (adjusted for inflation).  Even cell phones these days have more storage.  My cell phone, for example, has a little over 8 gigabytes of storage.  That’s 267 times as much as my first hard drive, or 88,000 times as much storage as those first cassettes.  Ironically I was never able to fill up my 30 MB hard drive; both programs and data were much smaller in those days since we didn’t (couldn't) store pictures, music, or video.
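
Just to check my own math, here’s the arithmetic in Python.  The round-number sizes (a 2.5 MB photo, a 1 TB drive) are assumptions for illustration:

    # Sanity-checking the storage comparisons above (all sizes in kilobytes).
    cassette_kb   = 90
    photo_kb      = 2_500          # an assumed ~2.5 MB digital camera photo
    first_hdd_kb  = 30_000         # the 30 MB drive from 1989
    phone_kb      = 8_000_000      # a cell phone with ~8 GB of storage
    modern_hdd_kb = 1_000_000_000  # a ~1 TB drive, circa 2009

    print(photo_kb / cassette_kb)            # ~28 cassettes for one photo
    print(photo_kb / cassette_kb * 20 / 60)  # ~9 hours at 20 minutes per cassette
    print(modern_hdd_kb / first_hdd_kb)      # ~33,000 times my first hard drive
    print(phone_kb / first_hdd_kb)           # ~267 times that drive, in a phone
    print(phone_kb / cassette_kb)            # ~89,000 times those first cassettes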

RAM was a different story altogether.  My Atari 600XL had 16 kilobytes of memory; a modern computer has about 125,000 times as much.  We later upgraded to a model with 64 kilobytes of RAM.  In 1987 I got my first computer with a whole megabyte of memory.  Today we use computers with 2-4 gigabytes of memory… only 2,000-4,000 times as much.

Performance

Performance has seen huge improvements as well.  My Atari 600XL had an 8-bit processor that ran at 1.79 MHz.  Computers today use 32- or 64-bit instructions, typically run at 2-2.5 GHz, and have multiple cores that can nearly double that performance.  So in addition to the clock speed being more than 1,000 times faster, efficiency is leaps and bounds better too.  Machine instructions that once took multiple clock cycles to complete now finish in one (or even less than one) cycle.

Graphics

The very first computer I used was not capable of displaying graphics at all.  Everything was done with text.  If you wanted to simulate graphics it had to be done by using the symbols you find on your keyboard.  Pretty crude.

The next computer could display 4 colors at once.  And I was fortunate, because that model let me choose which four colors they were.  The IBM PCs of the day, even when they were capable of color, were restricted to four colors pre-selected by the video card, and those couldn’t be changed. 

Of course we couldn’t measure anything in megapixels.  We were lucky to be measuring in kilopixels. 

Photographic-quality images didn’t come around for more than two decades.  In the late 80s there were some specialized programs that enabled my Atari ST to display some near-television-quality images, but it was done with trickery, as the computer was technically only able to display 16 colors at a time; this was very atypical for the era.  And even then the resolution was low enough that today we’d consider those images “postage stamp sized.” 

3D images were something that were only found in labs, and even then, mostly restricted to wireframe images.  Pixar was still years away from doing their first 3D animation.

The idea of computers being able to display motion video was preposterous.  Television quality video was more than two decades away for consumer-level machines.

Audio

The sound capabilities of computers have improved dramatically too.  We started out with just beeps, if anything at all.  Beeps turned into crude music synthesizers with noise generators, which eventually gave way to more sophisticated synthesis.  Finally, nearly two decades after I started using computers, they began to be able to play back recorded sound, like music.  Not that it mattered much at the time, because computers of that period didn’t have enough storage or processing power to play anything longer than a few seconds.  It wasn’t until nearly the turn of the century that recorded music on a computer really became plausible.

Communication

There was no “internet” when I first started with computers, at least not in the way we think of it today.  It wouldn’t become popular and well known for another two decades.  Our communication options were limited to direct computer-to-computer connections over dial-up modems at 300 baud.  The average DSL connection today is 5,000 times faster.  Sharing pictures, music, or video was totally out of the question.  Even if we could have done anything with them, the most basic images we use today would have taken days to transfer. 

With no Internet, there were no Internet Service Providers.  About a decade into my computer adventure, BBSes (bulletin board systems) started to pop up.  They were similar in concept to the forums we see on web sites today, but only one or two people could be connected at a time, and all conversations stayed within the confines of a single BBS, which meant you could really only talk to other people in your local telephone calling area.  World-wide communication was nonexistent.

Peripherals

The inkjet printer didn’t become available until I had been involved in computers for more than a decade.  Most of us, if we had a printer at all, used only dot-matrix models (the really noisy and slow printers that are nearly extinct today).  Instead of replacing ink cartridges we replaced inked ribbons.  I used to have to save up my allowance and wages from my paper route to be able to afford either one.  Color printing was unheard of for the first decade, and even then it was pretty much limited to colored text.  Printing graphical images was, let’s say… pathetic.

Mice were nearly unheard of for the first decade I was working with computers.  They didn’t become common on IBM PC compatible computers until Windows caught on, another five years after that. 

Digital cameras were still more than 20 years away from becoming available commercially, and 25 years away from being known to the public.

The display monitor is actually something that materialized during my time with computers as well.  Before that we used punch cards, or line printers with keyboards, or, more recently, integrated terminals that combined a display and a keyboard.

Scanners were virtually nonexistent.  Networking was virtually nonexistent as well.  Affordable wireless networking was more than 25 years away.

Surprisingly, keyboards haven’t changed that much.  In fact, the keyboards we had on many of those first computers were better than the ones we use today.  The only thing that has changed significantly about them is the addition of ergonomic layouts, and specialized keys for multimedia functions.  Oh, and wireless is relatively common.

CD-ROM drives didn’t come about for about 15 years.  DVD was two decades away.

Portability

The first “portable” computer I used was the original Compaq Portable, released in 1983 (this was how Compaq got their start).  It weighed 28 pounds.  But that was 6 years after I started programming; prior to that there were no “portable” machines. 

The first computers we considered “laptops” were nearly two decades away.  The idea of something like a netbook or an iPhone would have been science fiction.

Computers Introduced In My Lifetime

Actually, I should clarify that section title a bit.  These are all computers that have been released since I first got involved in the field.  I actually remember when most of these were released, or at a very minimum, I remember using them when they were still very new.

  • Apple II (1977)
  • IBM PC (1981)
  • Apple Macintosh (1984)
  • Atari 8-bit series (1979) and ST series (1985)
  • Commodore PET (1977), 64 (1982), and Amiga (1985)

Operating Systems
  • MS-DOS
  • Mac OS
  • Windows (all versions)
  • Mac OS X
  • Linux
