Category Archives: Uncategorized

Back in the saddle again

In keeping with my tradition of providing a tune that somehow aligns with the blog entry's topic, check out this gem. And, just for good measure, some Plumbum extraction music.

Yes, after damn near a year, I am working again. I even managed to move away from design/design verification. This was, by far, one of the biggest (and most desirable) career changes for me. After nearly 30 years, I desperately wanted to get out of mainstream engineering. That's not to say there weren't good times and projects in my role as a DV engineer. To the contrary, there are designs whose internals I learned that truly amazed me: early-80s RISC processor design à la DEC's Titan, ECL RISC at MIPS, weighted RED queue management at Cisco, spread spectrum clocking at Sun, scaring up a bug in an IO block at NeXT, a failed server attempt at Apple. I remember a set of schematics for a processor someone from DEC's Western Research Lab produced which was a master class in CPU design and clarity. As an aside, Titan, the discrete 100K ECL CPU, was the fastest uniprocessor inside DEC from '85 to '89 (and maybe beyond). I even took a spin at evangelism at Microsoft, of all places.

I was pretty fortunate back then; I got to rub elbows with some pretty smart (even for me) people. Notable, and in no particular order: Anita Borg, Mike Nielsen, Jeremy Dion, Neil Willhem, Loretta and Brian Reid (at the time), Smokey Wallace, Forrest Baskett, Russel Kao, Jeff Prisner, Mike Powell, Norm Jouppi, Bill Hamburgen, Judson Leonard, Carol Peters, Bob Stewart, and on and on.

The new gig with the multi-syllabic title is “IT Applications Analyst,” meaning that I am now a professional agent of change, working to move groups from one IT-related paradigm to the next. The current project? A new way to track bugs.

IT is a funny space, similar in some ways to DV in that if you do your job well, no one knows you exist. It’s only when a server crashes or the network goes away that you get any sort of notice. I’ve plenty of experience with that under-the-radar existence.

Being officially on the outside of core (HW) engineering product development, I get to see how engineering orgs are perceived. I have to admit, sometimes we ARE the stereotypes we see in the media. Nonetheless, I'll miss some parts of it; there was something rather sublime about having a singular, stimulating focus on solving one or two large problems.

Given the desire to commoditize the pschidt out of all things hardware, I am glad to be moving on, though I hope that my former industry sector struggles through and begins to take up its role in leading innovation once again. I am pretty tired of 3rd- and 4th-order derivatives of Facebook, Google, and Instagram. And lord knows, I am looking forward to seeing people actually sell things again, as opposed to being a “tax” on the internet.

Nowadays, things are much more about developing trust, understanding roles, and building relationships. A far cry, in a lot of ways, from straight-ahead development.

Release the hounds…

Lost in Translation: Tablets are just fine even when sales are down

At the end of Q2 2014, I recall reading here and here that while iPad sales were off, MBP and iPhone sales were up a little, or at least enough to cover the shortfall in iPad sales. A similar phenomenon was reported across the industry in general, i.e., laptop sales were up. This got me to thinking about tablets in general: why they are brilliant devices, and at the same time why they are being cast as the square peg destined for the elliptical hole by a lot of consumers.

As we collectively watch successively more powerful tablets surface (yes, pun intended), I have to wonder if people aren't missing the point of these devices. Tablets are great devices for consuming. They aren't designed to be real good at producing, at least based on current tablet OS capabilities. OK, you can add an external keyboard and, voila, a netbook-lite emerges. I offer that tablets, as originally envisioned, are just fine. They're great for reading, simple message responses, purchases; you know, all the stuff that most of us do with them presently. They are NOT, however, great for writing the next great American novel, or any other long-form content production. That's OK.

I am not worried about tablets in their current incarnation, and you’ll never see an external keyboard attached to mine. That’s what my laptop is for.

Those early tablet producers (Microsoft included) had it right, and Apple's entry into the market reinforced the notion. Tablets are great, portable computing devices fulfilling a particular need. They are not, however, a singular solution for all of one's computing. So let's move on; it's OK to whip out that laptop when you commence writing that great epic tome you've been ruminating on. I promise, I'll download the e-version and read it on my tablet when you're done.

 

Goodnight moon

Yo! Relative hypocrisy

Within the last few weeks, an app entitled “Yo” has become fodder for both the tech and lay media alike. Here's Stephen Colbert's take on it.

Yo! really is a silly app. With that being said, I have to wonder if maybe there isn't a little unintentional (or maybe intentional) hypocrisy at play here. As for the developers of Yo!: are their aspirations any more trivial than the aspirations of more “serious” apps or other tech endeavors?

As a mental exercise, allow me to offer some other spaces where our newly re-invigorated critical analysis skills might be brought to bear:

  1. The “New Economy”: Really? You mean to tell me there has never been another attempt at providing free content to consumers underwritten by advertisers. Hmmm, OK, if you say so.
  2. The solution for both gender and ethnic diversity in hi-tech: This is a new problem? One unique to hi-tech, and therefore immune to best practices from other industries? Shut the front door!
  3. MOOCs and online education in general: I recently saw a report indicating that the so-called “under-served” communities who were supposed to benefit most from MOOCs and other online courseware actually have a lower completion rate than was originally expected. Why is that? I also wonder if, after these people take all these online courses, folks at Google, Apple, Facebook, etc. will actually hire them. Last I checked, these three were still trolling the halls of Stanford and other elite universities for their candidates.
  4. The need for more “coders”: Coders to do what, and at what price point? If I waved my magic wand and made everyone a coder, what would that really do for the labor rate of these folks?
  5. The proven “meritocracy” of the tech sector: Do you really want me to comment here?

 

Finally, the funniest thing in all of this is that, by all popular accounts, tech is a bastion of market forces. If we are to REALLY hold that as a fundamental truth, then let the market decide the fate of Yo!. I mean, what are we afraid of? That some completely ridiculous “product” might end up proving to be incredibly commercially successful?

I have an idea: let's try a little honesty in characterizing the motivations of our industry. “If it makes money, and doesn't do significant harm, we're all good, right?”

 

P.S. There's a sister at Emory who examines online colleges/courseware critically. You can check her out here.

The Internet of Things 1.0 (No One Receiving); Now what?


Here we are, somewhere beyond version 1.0 of the Internet of Things. I believe Eno’s song “No One Receiving” to be a fabulous commentary on the IOT initiative up to this point.

Remember those initial halcyon days? With Google connecting everyone's home power meter to the internet, with the promise of a bright and better future. Wow, what a glorious time to be alive it was, indeed.

Seriously though, what went wrong? Why didn't things take off from there? What needs to be different this time around, with the latest push into the cosmos of the IOT?

In those early days, did anyone ask the consumer if they wanted all manner of electronic gizmos connected to the internet? It seems to me that if IOT is to become ubiquitous AND USEFUL, engagement with consumers is crucial. IOT folk HAVE to ask questions beyond just whether or not the consumer thinks it's cool. Is it necessary? It's pretty clear the first round of IOT fell victim to believing its own hype.

I recently read Francis DaCosta's book “Re-thinking the Internet of Things.” It's a great read that asks some tough questions about current thinking around IOT, and DaCosta also provides what I believe to be a pretty good outline for a solution architecture. Is his vision destined to be the market winner? Maybe; it certainly seems more sensible than just allocating an IPv6 address to everything on the planet. On the other hand, given my jaded outlook based on 30 years in the valley: where is it written that the best solution portends market success?

I guess my point to all of this rambling is that, while it can't be denied IOT is probably the next thing, the industry might really want to make sure they come correct this time; too many missteps might just derail the train.

The rewards are great…, and so are the risks

 

Back to Eno…

Update on May 7, 2014

 

This is precisely the sort of thing I'm saying shouldn't happen. IOT has to be intentional. Well, maybe not; maybe it can really be just a scattershot, try-anything-and-see-what-sticks sort of affair. That might work for some, but I believe the winners, i.e., those entities who spend the least amount of time just trying something, anything, will be those who think things through. In this instance, a washer could be part of a useful IOT system if, for instance, you also stuff RFID chips into clothes, have sensors in the washer to read those chips, and then use that information to accurately set the wash cycle.
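To make the washer idea concrete, here's a minimal sketch of how that logic might work. Everything here is invented for illustration: the tag IDs, the fabric-care database, and the cycle names are all hypothetical stand-ins for whatever a real garment-tagging scheme would encode.

```python
# Hypothetical sketch: a washer's RFID reader scans the tags sewn into a
# load of clothes, then picks the gentlest cycle and lowest temperature
# required by any garment in the load.

# Assumed care data encoded per tag (IDs and settings are made up).
TAG_DATABASE = {
    "tag-001": {"fabric": "cotton", "max_temp_c": 60, "cycle": "normal"},
    "tag-002": {"fabric": "wool",   "max_temp_c": 30, "cycle": "delicate"},
    "tag-003": {"fabric": "denim",  "max_temp_c": 40, "cycle": "heavy"},
}

# Rank cycles from gentlest to harshest so we can take the minimum.
CYCLE_RANK = {"delicate": 0, "normal": 1, "heavy": 2}

def choose_wash_settings(tag_ids):
    """Given the tag IDs the washer read, return (cycle, temp_celsius)."""
    garments = [TAG_DATABASE[t] for t in tag_ids if t in TAG_DATABASE]
    if not garments:
        # Nothing readable in the load: fall back to a safe default.
        return ("normal", 40)
    # The whole load must use the gentlest cycle any garment demands...
    cycle = min((g["cycle"] for g in garments), key=CYCLE_RANK.get)
    # ...and the lowest maximum temperature any garment tolerates.
    temp = min(g["max_temp_c"] for g in garments)
    return (cycle, temp)

# A mixed cotton/wool load gets the wool's delicate, 30°C settings.
print(choose_wash_settings(["tag-001", "tag-002"]))  # → ('delicate', 30)
```

The point of the sketch is the "think it through" part: the tags alone are useless unless the washer's firmware encodes a policy (here, most-conservative-wins) that turns sensor reads into a decision the consumer actually benefits from.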

More to come…

 

Little Fluffy Clouds

I think from now on I am going to always try to include a link to a song apropos to each post, or at least do so as best I can. This week, let's run with Little Fluffy Clouds by The Orb. This song was released sometime around 1990 and was subsequently used in a VW commercial around 2000, which was when most folks became aware of it.

Wikipedia defines the cloud as follows: “Cloud computing in general can be portrayed as a synonym for distributed computing over a network, with the ability to run a program or application on many connected computers at the same time. It specifically refers to a computing hardware machine or group of computing hardware machines commonly referred to as a server connected through a communication network such as the Internet, an intranet, a local area network (LAN) or wide area network (WAN), and individual users or user who have permission to access the server can use the server's processing power for their individual computing needs like to run an application, store data or any other computing need.” By way of this definition, I can claim I had access to cloud computing resources as far back as 1983. Guess what? That makes cloud computing older than The Orb's song I mentioned at the outset of this piece, and in all of that time up to now, private clouds have been the coin of the realm. Gartner believes going forward this will be the case at least until 2020, or at least that is what the CEO of VMware claims Gartner says.

The enterprise community might find it prudent to invest in developing their own infrastructure, tuning it in ways that serve their concerns and business goals, even though that goes counter to what passes for common wisdom in the “cloud” industry these days. Enable your people to develop the skills and ideas which might give you the edge; the industry now calls them “doves.” Make sure, however, you couple that investment with actual outcomes you can see and measure which enhance your business. Don't get skittish just because you can hear public cloud footsteps in the distance behind you.

And then there is the intangible component of serendipitous innovation. If you have an advanced development IT team, preserve it and let it spread its wings to create; if you don't, consider seeding one.

Public cloud providers will be driven to scale in order to be profitable while simultaneously having their margins cut due to competition. Google was the first to fire a shot across that bow of engagement earlier this year (2014). Ultimately, that means moving toward a model in support of the lowest common denominator. As an enterprise, are you sure that's where you want to base your service/app deployment?

You enterprise folk have been here for a while with your infrastructure and expertise. Think hard before you throw the baby out with the bathwater.

 

It’s not about Open Source, it’s about process

I can't resist. This is one of the dumbest articles ever. The title alone, “OpenSSL Heartbleed: Open Source Bloody Nose for Open Source Bleeding Hearts,” speaks volumes in terms of idiocy: a somewhat veiled attempt at poking at the efficacy of open source code. Don't get me wrong, I'll support a proprietary system in a heartbeat (no pun intended) if it's good. That being said, Heartbleed has more to do with design/code review processes than with open source code quality. In fact, maybe I could have saved the 83 previous words I just wrote and barked “1994 Pentium Floating Point Bug.”

Mike Judge is getting it right

As a 30-year vet of the valley, I have to say Mike Judge's HBO series Silicon Valley is getting it right on a lot of different fronts. He's capturing the mania that is the valley these days: the preciousness, the arrogance, the “change the world via apps” mentality, etc.

Mike is also capturing the lack of diversity (read: no blacks, latinos, or women of note) in the valley. I am not being critical of Mike per se, though I wonder if his accuracy is intentional or accidental. In either case, seeing this lack of diversity by way of this show is really stunning. It's something we should all consider as the valley culture continues to develop.