If you’re out in the Bay Area or on the other coast in New York or Boston, it’s pretty easy to be smug about your culture of risk-taking, pool of top talent, and string of successful, world-changing innovations. But as the world accelerates toward one that’s increasingly connected, and new ways of collaborating make distance irrelevant, smart people will pop up everywhere and I’m convinced we’ll see a flattening of the geographic advantages these pockets of innovation represent.
Six of us were bugged that so much Internet and Web technology innovation was happening right here in Minnesota with no one showcasing it, so when I suggested we start our own blog to do exactly that, there were nods of agreement and a willingness to dive in and make it real.
The biggest reason we were all interested in this blog is that these showcases and interviews are what we wanted to read and there wasn’t anything like it out there.
The result is Minnov8: Minnesota Innovation in Internet & Web Technology. This past weekend was the biggest Barcamp yet, Minnebar, and over 400 people showed up to present, learn and participate. Rather than recreate everything on this blog, why not take a peek at Minnov8? This post and this one recap what took place.
Wherever you live, and whatever space you care about (e.g., technology, education, greentech), if there is a critical mass of people willing to leap in and work together as multiple authors, I’d encourage you to start one of these…it’s pretty simple to do and fun to boot.
Over the past few years, I’ve been in numerous discussions about how social media (and blogging in particular) is driving a new level of transparency in marketing, public relations and corporate communications, while also providing unprecedented opportunity for thought leaders to carve out a niche in new and powerful ways.
In my consulting engagements, when talk comes around to crowdsourcing and ways to spark creativity and innovation through social media, Apple is often brought up as the example of how to innovate: “We’ve got to create an iPod.”
Often this occurs without much talk of how Apple really succeeded: by focusing on the entire value chain. Nailing that value chain was the secret sauce. Apple tied the iPod to a desktop application (iTunes) so people could rip their CDs and manage their music, connected that same application to a Web-hosted service (the iTunes Store), and then offered the whole package to an industry on its knees as its product (music) was being stolen out from underneath it.
But then I’m quizzed by clients: “Hey, wait just a dang minute, Borsch. You’re promoting and pushing us to be transparent and let employees blog when a company you laud, worked for and own stock in is the polar opposite?” Apple is a different beast: it needed to be opaque when it was close to being out of business in the 1990s, but the problem is it hasn’t changed direction on its lack of transparency now that it’s a resounding success.
I’ve been troubled by that paradox until just now.
Ever since Mac OS X Server shipped in 1999 and the desktop version in early 2001, many Apple and enterprise I.T. watchers have pontificated about Apple possibly moving into enterprise sales in a big way and making it a focus of effort.
On a scale of 1-10 (with “10” being hyperfocused on a strategic market), I believe Apple’s interest in the enterprise is a “2”.
My friend, Graeme Thickins (blog; business), sent a few of us an eWeek article today entitled, “Apple Goes Enterprise.” The author’s premise? That enterprise I.T. is clueless unless it seriously considers Mac OS X Server and Apple’s Xserve or Xsan hardware for the server room, given the world-class aspects of these products; his argument rests on the merits of what Apple offers.
I’d agree they’re worth a serious look, but I see one huge caveat to this article from the point of view of someone who was a manufacturer’s rep for Apple in the early 1980s, worked for the company again for three years after Jobs came back in 1996, held leadership positions in the enterprise software space (e.g., Vignette; Lawson Software), and has thought long and hard about what Apple is up to while simultaneously knowing what it takes to kowtow to and please enterprise I.T. folks.
The enterprise wants every conceivable feature and typically places its bets on technology momentum, a new class of product, or a vendor that either delivers a corresponding support infrastructure (i.e., one that invests in enterprise-specific support) or has off-the-charts demand. Currently Apple’s support for the enterprise is modest at best, and many of Apple’s former resellers (who could have supported the enterprise) are gone due to the Apple Store juggernaut.
In a January 2000 Fortune magazine interview, Jobs said this about Apple’s new directions — including any sort of focus on enterprise sales — in response to questioning about why Apple wouldn’t pursue the enterprise, given its reenergized and growing sales and the then widely accepted “jelly bean” iMac:
Today’s Macworld “Stevenote” was interesting and brought back memories. In the spring of 1984, I was in San Francisco for the Apple II Forever rollout of the Apple IIc, a compact desktop machine that still needed its little green-screen monitor (I was with a manufacturer’s rep firm out of college and Apple was our major line…and this was before they hired their own direct sales force).
Though some are already pointing out the MacBook Air’s shortcomings (e.g., a non-user-replaceable battery; only one USB port), I still have to admit to being amazed at the power we have in our hands compared to what I’ve lived with as the personal computer industry has evolved.
My biggest pump today? The iPhone’s new software. I’ve already bookmarked some Google Maps locations and created two screens of oft-used web sites that I’ve “clipped” and made into icons to instantly go to a page…and the exact part of a page all zoomed in and so forth (see the demo here).
Take a walk down memory lane and watch this video from the Apple II Forever event in San Francisco in April 1984, and you’ll see that all the cool stuff announced today is but a milestone on the way as we walk into our technology future.
Tonight I spent some time roaming around inside the Internet Archive and came across the video below from a San Francisco public television show, The Computer Chronicles.
Here’s what it says at the Archive for this video,
“It wasn’t quite the World Wide Web yet, but everybody started hearing about this thing called “the Internet” in 1993. It was being called the Information Superhighway then. This program looks at the earliest stages of the Internet including Aladdin Systems SITComm, a Macintosh communications program for Internet access, and the WELL (Whole Earth Lectronic Link), an early online community.
Also featured is a visit to the former Bell Labs in New Jersey (now Bellcore) for demonstrations of internet based teleconferencing, video on demand, ISDN, and optical network technology; a preview of the World Wide Web as used at NASA; a visit to where it all began, ARPA, the Advanced Research Projects Agency in Virginia; and a look at the Internet Multicasting Service in Washington, the first Internet radio station. Guests include Brendan Kehoe, author of “Zen and the Art of the Internet”, Howard Rheingold, author of “The Virtual Community”, Dr. Robert Kahn, former director of ARPA, and Carl Malamud, author of “Exploring the Internet”. Originally broadcast in 1993.”
Take a peek at this now-14-year-old video and realize how far we’ve come…and where we’ll undoubtedly be 14 years from now as the rate of change accelerates.
After posting yesterday on the “Top Five Reasons that Leopard will be Apple’s tipping point“, the response reinforced for me that deep passion still exists on the various sides of the computing table (Windows; Mac; Linux).
You know what I think is happening that’s actually accelerating fanboy-dom and compelled so many people to comment? Those of us on the ‘net, and extensively using computers, have our faces in them for more total hours than ever before. With the explosion in laptop sales — most of which have Wifi cards in them — these tools are being schlepped all over the place and used as the general purpose devices they’re meant to be. All day, every day we’re accessing Web 2.0 sites, using applications, editing video and audio, communicating through Skype and webcams and much more.
So people naturally invest dollars and then themselves emotionally in their chosen computing platform. Most of us customize our device with wallpaper, sounds, applications and — because of ever larger hard drives and our extensive use spawning more digital files than ever — we structure and archive our digital lives with these devices. So it stands to reason we all have heightened awareness of our time investment as well as being intellectually invested in learning how our computers work and where stuff is located. Having anyone intimate that your choice is wrong and that you’re a schnub for choosing your computing platform is like whacking at a hornet’s nest.
As I write this I’ve had 60,000 pageviews of that post, nearly 80 comments under it and more than 500 comments on the Digg submission. Too many to address individually (and many are trolls being anonymous and nasty and are not due a response).
Windows and Linux have many, many compelling features and attractive aspects and run on cheaper hardware. But to me the obstacles and barriers to using them as productively as I do the Mac are too high (and, in fact, I have Ubuntu Linux and WinXP on this MacBook Pro now). Using these other OSes causes me to spend a lot more time “twiddling bits” than being creative and productive.
I’m pleased with the platform I’ve chosen (Mac) and the result is I’ve invested many thousands of dollars in dozens of machines, applications, training and knowledge. Having used all three platforms extensively for many years, my chosen one allows my staff and me to be more productive, and our output is remarkable, ranging from color print to ebooks, video, audio, and much more. Our tech support needs and the hours invested in twiddling bits are now 10% of what they were when we were running Windows. The bonus is no adware. No spyware. No viruses.
At all the major tech conferences I attend, all the alpha geeks and the fashionistas are walking around with Macs (hmmm….am I a geek or a bon vivant? I know what my kids would say!). I often find myself in the front of an audience, and scanning the crowd of people with their laptops is interesting…there is an accelerating number of glowing Apple logos at these places where influencers meet.
I’m getting together with a friend and colleague tomorrow for one key reason *and* because he too took delivery of a new MacBook Pro last week. This is a guy who has used Windows machines his entire life.
Even the dyed-in-the-wool PC guy, Chris Pirillo, has gone Mac.
My 18-year-old daughter works at a local Apple Store and we talk often about how busy the place is all the time. Whenever I go there, there’s a constant and steady stream of purchasers. She’s indicated that this back-to-school season is “awesome” for sales, though no analyst would buy more stock due to THAT recommendation.
It’s funny…when I bought my MacBook Pro some time ago I purchased Parallels and an OEM version of Windows XP for $49 (and I threw away the $10 sound card I had to buy to get it) thinking that I’d need to continue to run Windows apps. You know what? In six months I’ve opened it up about 10 times. I play with my install of Ubuntu Linux more than I use Windows.
Why is this happening? The platform works; it’s elegant and quiet; it’s based on Unix; it’s secure, with no spyware or adware; it has a great user interface; it’s perfectly positioned for user-generated content; and it’s compatible with Windows in many ways. In short, about anything people want to do today (other than run Windows-centric proprietary applications) can be done on a Mac.
This weekend I’ve had a lot of time to reflect on how fragile and dependent we all are on infrastructure and distribution. All of these thoughts have also had me remember being a kid during the 1970’s energy crisis (with lines at gas stations) and times when pending blizzards caused people to make a run on grocery stores stripping the shelves of certain foods, water and other staples.
We’re far too dependent on so much infrastructure and distribution systems that most of us either take for granted or simply expect will always be there.
The Interstate 35W bridge collapse two weeks ago was the first stunning blow about the frailty of infrastructure and a wake-up call for all of us. I’ve been reading a tremendous amount about needed bridge and roadway repairs in the US and it seems as though every state (as well as the Federal government) is suddenly taking action.
Yesterday morning’s storm here in Minnesota knocked out our power at 3am Saturday, and it’s expected to be fixed by close-of-business on Tuesday! It’s only affecting 45,000 people here now, so this is a local story. But what a pain in the butt it is to be without power. Thank goodness my neighbors behind us have power, so I’ve run a contractor-grade extension cord so I can plug in my refrigerator and sump pump.
When there have been huge electrical outages there have been outcries (I wrote about it here and you can read more about our crumbling power grid here). But since these outages have yet to be in the same horrific category as a catastrophic bridge collapse, not much is being done. I also remember Reddy Kilowatt, the electric industry spokescartoon who encouraged us to use electricity. He’s retired now, having outlived his usefulness in a time of energy conservation.
I’m sitting here in a restaurant this morning with free Wifi since my Internet access at home is out (no electricity…no working cable modem). Thankfully I have an office a short drive away with power, so it’s not too horrible and I can still get work done, but my 12-year-old son keeps asking me how he can get on the ‘net from home. There are some silver linings to having all the electrical stuff off, but I’m not too interested in living off the grid just yet.
Yesterday afternoon I almost purchased a portable generator. Instead, I’ll be buying a standby generator (which runs on natural gas) that I can tie into my home’s circuit system and use to prioritize my heat and air conditioning; sump pump (critical since my basement flooded last year when the power went out!); refrigerator; and a few other items, so I’m not at the mercy of Xcel Energy or a kind neighbor. The cost will be roughly $5k installed, and a whole-house generator (instead of a limited number of items) is about $13k. What’s enlightening to me has been the exercise of adding up all the watts I pull in my house and realizing how tough (and expensive) it is to be self-sufficient with energy!
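If you want to do the same exercise, the arithmetic is simple to sketch. The wattage figures below are illustrative estimates I’ve plugged in, not readings from my actual house:

```python
# Rough generator-sizing sketch: sum the running watts of the loads
# you want on backup power. Wattages here are illustrative estimates.
essential_loads_watts = {
    "furnace blower": 800,
    "central A/C": 3500,
    "sump pump": 1000,
    "refrigerator": 700,
    "lights and outlets": 600,
}

total_watts = sum(essential_loads_watts.values())

# Generators are rated in kilowatts; pad ~20% for motor startup surges.
recommended_kw = total_watts * 1.2 / 1000
print(f"Running load: {total_watts} W; suggested generator: ~{recommended_kw:.1f} kW")
```

Even this short list lands in standby-generator territory, which is exactly why the whole-house option gets expensive fast.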
The small outage of Internet hosting I referenced in my post is so laughingly small that it went almost unnoticed by the general public. But as more and more of us map our businesses, our social networks and our communications on the ‘net, the potential for horrific and catastrophic outages — though unlikely to take lives — may finally get people to wake up to our dependency on bridges, our distribution system, electricity, the Internet, and all the other systems and processes we now take for granted.
Yesterday my son and I (on our 8th Annual Dad & Son Adventure) drove into Chippewa Falls, WI since I’d never seen the town and was curious as to why Seymour Cray (seen at left below), the “father of supercomputers” placed the R&D arm of Cray Research there.
My 12-year-old is like most: his eyes roll up when I tell him we’re stopping by a museum. I’ve learned to set hard limits on time (“we’ll only spend X minutes there and then decide if we want to stay“), which works well, so I can always get him to agree to at least take a peek.
Turns out the museum wasn’t open officially yesterday, but they let us wander around on our own and read the signs within the small exhibit area.
In many ways looking at the early Control Data computers was like seeing a set from the movie Dr. Strangelove. But what was really fascinating to me was the early wiring diagrams for 1950’s era computers that were drawn by two women at drafting tables from the design specifications Seymour Cray put together. That, coupled with the mass of wires embedded in the early Cray supercomputers, seems incredibly inefficient by today’s standards.
What impressed my son most was when I pointed out that the RAM in his Nintendo DS stores as much data as one of those HUGE 26″ platters seen in the photo (shown with a CD and floppy disks for comparison). The sign read, “This 26 inch platter weighs approximately six pounds and holds a total of 4 million bytes of information, 2 million bytes (2 megabytes) on each side. (Abbreviated 2MB.)
The colored 3.5 inch disks currently in use today hold 1.44 megabytes each, so only three disks would be needed to store the same amount of information as the 26 inch platter.
A current recordable compact disk (CD-R, below) holds 700 megabytes (700MB) and weighs approximately one ounce. This CD will hold the same amount of information as 175 platters, which would weigh 1,055 pounds. No wonder these disks earned the name “compact” and are so popular!”
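The sign’s arithmetic is easy to check; the figures below come straight from the sign (the stack weight works out to roughly 1,050 pounds, in line with the sign’s approximate 1,055):

```python
# Checking the museum sign's arithmetic, using the sign's own figures.
platter_mb, platter_lbs = 4, 6   # 26-inch platter: 4 MB, ~6 lbs
floppy_mb = 1.44                 # 3.5-inch floppy disk
cd_mb = 700                      # CD-R

floppies_per_platter = platter_mb / floppy_mb      # ~2.8, so 3 disks needed
platters_per_cd = cd_mb / platter_mb               # 175 platters per CD
platter_stack_lbs = platters_per_cd * platter_lbs  # ~1,050 lbs of platters

print(floppies_per_platter, platters_per_cd, platter_stack_lbs)
```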
Since I was born in the late 1950’s, all of this evolution has occurred in my lifetime. I remember a guy across the street from me who was going to “get in to the computer business with this company called Control Data” which seemed pretty exotic in the late 1960’s. I still have some measure of sadness-from-afar at the demise of Control Data and how the computing business ended and Minnesota became a relative backwater in technology.
It’s important for kids to truly understand the evolution of computing, as well as all the other things upon which we build our collective future. People like Seymour Cray are not studied in schools, and in the same way inventors like Dean Kamen have done amazing things and are not lauded at all — but my guy knows about Kamen since we talk about his successes and his failures.
So how did our museum peek go? The proof was when we hopped in the car afterwards and I asked my son what he thought: “It was cool Dad…I liked it and those old computers were amazing.”
Often I take Robert X. Cringely’s columns with a grain of salt, but this one, entitled “Game Over: The U.S. is unlikely to ever regain its broadband leadership”, really hit me since I make my living on Internet-centric management consulting and view broadband as the key enabler of business going forward. Cringely’s article is an important one to read if you care about US competitiveness in the future.
Back in the mid-1990s I had an ISDN line with a whopping 128kbps of access for $69 per month. Incredibly fast at the time; I even considered the bonded option for 256kbps (well over $100 per month), but I wanted to stay married. Today I have 8Mbps downstream and 768kbps upstream for essentially the same price.
I have friends in San Francisco with 10Mbps symmetrical (both upload and download) for under $100 a month. Others use Verizon’s fiber (FiOS) and get 15Mbps down, 2Mbps up for $50 per month.
But Cringely points to the 100Mbps speeds in Japan, others have complained that Japan is ahead of us too, and even the OECD’s April 2007 report (which showed the US at 25th in global broadband penetration and speed) is open to debate. So how important is it for us to be competitive in broadband speeds, and why aren’t we — the inventor of the Internet — in the world’s leading position for broadband speed and penetration?
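To make those speed numbers concrete, here’s a quick back-of-the-envelope sketch using the figures mentioned above, timing the download of a 700MB file (a CD’s worth of data) at each speed:

```python
# Back-of-the-envelope: time to download a 700 MB file at various speeds.
# Note that Mbps means megabits per second, not megabytes.
FILE_MB = 700  # file size in megabytes

speeds_mbps = {
    "mid-90s ISDN (128 kbps)": 0.128,
    "my cable today (8 Mbps)": 8.0,
    "Japan's fiber (100 Mbps)": 100.0,
}

for name, mbps in speeds_mbps.items():
    seconds = FILE_MB * 8 / mbps  # 8 bits per byte
    if seconds >= 3600:
        print(f"{name}: {seconds / 3600:.1f} hours")
    else:
        print(f"{name}: {seconds / 60:.1f} minutes")
```

That’s half a day on ISDN versus about a minute on Japanese fiber, which is the gap Cringely is worried about.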
When you think about the relative sizes of countries vs. US states, you begin to get a feel for the enormity of the problem. Japan is roughly the size of Montana, for example, and (as of 2001), 79% of the population lived in urban areas with ~20% in Tokyo alone. That makes it considerably easier to provide a high speed broadband infrastructure for the overwhelming majority of Japanese. It’s a lot tougher to do so across the vast geography that is the United States.
The stakes are too high, however, NOT to solve this accelerating need for true broadband. Ars Technica has a good article on House Democrats and discussions about ‘true’ broadband. I’m not even going to get into the lobbying and politics of broadband, telephony and wireless, but suffice it to say there are a lot of complexities in why we’re NOT the world’s leader. What most discussions don’t focus on, however, is that broadband is a driver of gross domestic product (GDP) output, and we need to be accelerating the Internet — both in speed and penetration — now.
What if a 1% increase in broadband penetration equaled 300,000 jobs? Read on for a very interesting set of data…