Napster, Uber, AirBnB, and Bitcoin VERSUS Regulation

If you have Netflix and are a '90s kid, watch the documentary Downloaded. The Napster documentary takes an interesting look into how a technology (peer-to-peer file sharing) single-handedly brought the music industry to its knees. While watching it, I couldn't help but think of two things: 1.) Shawn Fanning (the founder of Napster) is the man, and 2.) several other industries are experiencing the same level of disruption today (the taxi industry – Uber, the hotel industry – AirBnB, and, I would argue, the financial payment industry – Bitcoin).

To give some quick background: when Napster came around, the music industry and its constituents refused to understand and embrace the technology, and instead decided to sue it to high heaven. This ultimately led to the downfall of Napster within two years of its humble beginnings, by which point its community had reached as many as 60 million users (ridiculous). The music industry was colluding, i.e. all the major labels were fixing the price of CDs at $15. It was the industry's failure to innovate and diversify its business models that allowed this technology to hurt revenues so badly.

Now to today: all of the recent legislation and legal battles that this new wave of startups is facing are similar to what Napster went through. Granted, technically, people were stealing music through Napster, but the technology itself (peer-to-peer file sharing, decentralized file systems) was able to scale and could have been repurposed for a more legal use. The taxi industry, in one municipality or another, is fighting to keep Uber out. It keeps finding laws that have been set up over time that in some way, shape, or form make taxi-like services that aren't sanctioned cab companies illegal. (Side rant: two reasons I despise our government. Too many laws; solution: for every law we add, we get rid of another. And laws so protective of incumbent industries that they disincentivize innovation. Technology is our friend, not our enemy.) I feel like Uber has fought its way through a fair amount of regulation, legally, and will thrive.

The others, AirBnB and Bitcoin, are still in their infancy and possibly the first iteration of their service. The documentary brings this fact to light: although Napster failed, many copycats and enhanced, legal versions of the technology popped up. Despite Napster failing, the technology and service survived, and ultimately thrived over the next decade (i.e. Spotify, Pandora, etc.). Despite the more modest success of AirBnB and Bitcoin relative to Uber, don't think that the technology will go away. Other startups will iterate, learn what worked and didn't work for AirBnB and Bitcoin, and create a product that works. (Side note: I feel AirBnB will be successful, but it has much more legal mumbo jumbo to fight through, and ultimately change. Bitcoin is the wave of the future, yup.)



Today’s Unemployment/Underemployment Is Not Driving the Growth of “Uber for X” Companies

I ran across an interesting article on Bloomberg where the columnist argues that if it wasn’t for today’s “unusual” unemployment/underemployment (underemployment is where people are over-qualified for their current jobs), startups like Uber, Lyft, Instacart, Postmates, etc. would NOT have the large network of drivers that allow them to deliver quality on-demand services, thus driving their growth and valuations.

It's absolutely insane to me that anyone can make this argument with any kind of credibility. If the logic of her argument held in other recessions, she's saying that had these contract jobs popped up then, these startups wouldn't have survived. It's ludicrous to think that if Uber had existed in the '60s, '70s, or '80s, no one would have signed up to work for the company. I'm a firm believer that the American workforce will gladly do these "side-jobs", especially when there are little to no qualifications barring them from making the extra side-cash. What makes these types of jobs extremely attractive for the long term (and this isn't just a fad, as she concludes) is that workers get to utilize an asset (their car) that would otherwise be idle and costing them money. Regardless of whether this is a historic time for underemployment or not, Americans will always be looking to make more money to pay for that extra something, fill a void of unemployment, or make ends meet more reasonably.

Coming from a tech startup, I may have my own biases, but ignorance shouldn't be allowed in mainstream media. When non-tech people claim that an idea/company won't work, it makes us tech people work that much harder, and more often than not, that same naysayer is singing that company's praises years later.

Finding The Answers in Big Data

Ah, BIG DATA. In the last few years there haven't been bigger buzzwords than those. If you haven't heard of it, then I hate to break it to you, but everyone and their mother has been talking about it; look at the Google searches over the last ten years.

[Google Trends chart: worldwide web search interest in "big data", 2004 – present]

Working at an artificial intelligence startup, we've seen our fair share of clients/prospects looking for an alternative to all of the dashboards out there (Tableau, Domo, etc.). These dashboards (just a fancy word for all of the data and charts about your business on one screen) don't scream at the user what they should know, or what business decisions need to be made next based on all of their current data. It takes interpretation, and people are afraid to get the answers wrong.

As we continue on this path of "sensor-ized everything", people and businesses will be able to collect more data on themselves and about others, to possibly influence decisions. Regular consumers with their FitBits, and businesses with rewards cards and cookies on their websites; all of that data is now at their fingertips. But the question is: how do we turn all of that into actionable items? And more importantly, the correct actionable items?

Today companies are working to figure out how to interpret all of this newfound data and how to act correctly upon it. Sure, you can throw all of the correlations, relationships, and other fancy stats at these new data sets and find which one leads most directly to increased sales. But the funny thing is, there is rarely one answer to that question. What people need are instantaneous perspectives and explanations of what all of this means to their business, without having to interpret the correct answer themselves, and those explanations should change as the data changes, to better reflect the current state of the business.

The way I see it, there are a couple of answers to the “big data” problem:

1.) Businesses need tools to not only aggregate all of their old and "new" data, but also a way to communicate that data and its ever-changing properties. The traditional way to do that is to hire people to dig into and communicate all of this data. But that is hardly feasible, given the capital needed to hire the necessary talent. That's where artificial intelligence comes in.

Now, let me rant a little bit. Artificial intelligence can mean numerous things, and it means something different to everyone. “Deep learning” is one practice of AI, “machine learning” is another. People associate “algorithms” with AI. Heck, you could even classify the first chess program that beat a person as AI.

What's different about today's AI, however you want to classify it, is that it can begin to understand the outcomes of its analysis and communicate them. That's where Narrative Science comes in. As the world starts collecting more data, businesses should have systems/applications in place that allow the data to speak to us, instead of employing more resources to stare at dashboards for the same insight.
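To make "letting the data speak" concrete, here's a minimal sketch of the idea (not Narrative Science's actual product; the metric names and thresholds are invented for illustration): code that turns raw before/after numbers into the kind of plain-English sentence a dashboard never gives you.

```python
def narrate(metric: str, current: float, previous: float) -> str:
    """Turn a before/after pair of numbers into one readable sentence."""
    change = (current - previous) / previous * 100
    direction = "up" if change > 0 else "down"
    sentence = f"{metric} is {direction} {abs(change):.1f}% versus last period"
    # A hypothetical significance threshold: big swings get flagged for action.
    if abs(change) >= 20:
        sentence += " -- a swing large enough to warrant a closer look."
    else:
        sentence += ", roughly in line with normal variation."
    return sentence

print(narrate("Weekly revenue", 132_000, 100_000))
print(narrate("Site visits", 9_800, 10_000))
```

A real system obviously needs far richer language and context, but the design point stands: the output is a recommendation in words, not a chart that still needs interpreting.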

Oh, and the second answer to “big data”, in my opinion, is good ol’ common sense.

The Singularity (Robots Take Over) and Artificial Intelligence

The Wikipedia definition of "The Singularity": "a hypothetical moment in time when artificial intelligence will have progressed to the point of a greater-than-human intelligence, radically changing civilization, and perhaps human nature. Because the capabilities of such an intelligence may be difficult for a human to comprehend, the technological singularity is often seen as an occurrence (akin to a gravitational singularity) beyond which the future course of human history is unpredictable or even unfathomable."

When people think of situations like The Singularity, they often think of movies or fiction (Skynet, The Matrix, 1984, etc.). Each envisions a world where "the machines" are smarter than us and think on their own, without regard for the human condition or well-being. Famous entrepreneur and investor Peter Thiel has a particularly interesting take on all of this, much of which I agree with. To save you the time of watching the link, he says "all we [humanity] need is The Singularity." The reason I agree with the fundamental point of his message is that technology "usually" makes our lives easier and more efficient. More importantly, a new advancement in artificial intelligence would create new jobs in which humans control the "smart machines". Think of when machinery was first introduced to factories during the Industrial Revolution. The advent of machines in factories lowered the barrier to entry to the industry and encouraged new competition; for example, it took less "man-power" to produce thousands of garments in a day. Curating and maintaining a machine's artificial intelligence will be the "factory job" of the future. This may sound odd, but that may be because artificial intelligence is commonly misunderstood.

Working at a startup that gets looped into the "artificial intelligence" realm/discussions, I'm well aware of its recent resurgence in popularity. And what's comical/frustrating is the public's view of what artificial intelligence is. We've made it a point in sales meetings to explain the different "flavors" of artificial intelligence. So let me set the record straight: artificial intelligence, by and large, can be 1.) self-learning (machine learning), 2.) mimicking current human behavior, or 3.) deep learning. [Head over to the Narrative Science blog for more interesting pieces on this discussion.]

My larger point about all three types of AI is that, despite what you may think, all of them take human interaction and intervention. A computer has no sense of what is right and what is wrong unless verified by a human. Computers can find interesting things in data (machine learning), but only a human can verify whether a correlation actually reflects causation. For example, a computer might find that the number of kiwis harvested increases fairly linearly with the number of deer killed each season in Wisconsin. This is a useless, spurious correlation that we wouldn't want computers to flag as significant.

Sure, there are ways to program and help a computer judge, via metadata, what could be true or false. But don't believe that these machines are learning all on their own; they need validation, which usually happens "off-line", aka by humans (including Watson and every other piece of AI). That's where artificial intelligence creates jobs, as opposed to consuming the very jobs people are concerned it would replace.

Revolution in Batteries and Wireless Charging

If you didn't know that the "battery revolution" was upon us, then let Tesla's announcement of its plan to build a multi-billion dollar battery factory be your wake-up call. The lithium-ion batteries in our always-on phones are now moving to cars, as the movement away from non-renewable resources grows.

This push for longer-lasting, more efficient batteries is the next logical step in the world's dependency on energy everywhere, 24/7. But I think the inevitable step after "super" batteries is no batteries at all, via wireless charging. Let me explain a little more.

There is a company called uBeam that is keeping its wireless charging technology close to the vest (there's not much to the site). They're aiming to build new infrastructure and networks that work much like cell towers and cell signals do today, except they'd be emitting a certain frequency that a special sensor in your device can detect to initiate a charging-like action. So batteries wouldn't necessarily go away, but you'd be less tied to your power cord and a wall.

It makes sense that they may roll out an “in-house” product first, so you can wirelessly charge items in your house (and keep them charged, even with a weaker battery). At that point, a high-end battery would be rather useless if you could essentially be “plugged into the wall” 100% of the time.

Understanding Innovation… From the History Channel

This weekend I caught an episode of "The Men Who Built America" on the History Channel. This particular episode was about Andrew Carnegie, who founded the Carnegie Steel Company and began the "Age of Steel", ultimately cutting the production time for building-grade steel from two weeks to 15 minutes.

Certainly a better process, as well as a better understanding of the chemical properties of steel, helped them mass-produce it and inevitably build out America at rates never seen before. But it wasn't smooth sailing to convince people that steel was the next material for building better, stronger, and bigger structures, leading to the invention of the skyscraper.

His three main hurdles:

1.) Finding a better process to manufacture high-grade steel

Lesson learned: Research and development are the keys to the next innovative breakthrough. Easy problems don't make anyone money (besides the pet rock and Flappy Bird). Hard problems have the biggest payoff.

2.) Convincing railroads, construction companies, and consumers that steel was the new standard for building

Lesson learned: Marketing your new idea and trying to sell it sometimes makes you feel like a madman talking to himself in the corner of an insane asylum. It takes much longer than you think to persuade consumers, even the early adopters, of a new view/idea. Patience and persistence prevail when you've done adequate research and development.

3.) Going all in when you feel the market is "turning"

Lesson learned: Once you begin getting traction in the market, don’t be afraid to be aggressive with your spend, because it will be “now or never” to make your vision a reality.

Mind-F**k Friday – “Quantum Computing”

Computers as we know them today, ones that use processors, were developed and slowly enhanced beginning in the early 1900s. Slowly, companies started commercializing machines, including IBM's first accounting systems, which were all punch-card based. All they could do was add, subtract, and print the results. From the '30s to the late '50s, this was by and large how computers were used. They were a means of doing work, but they never replaced workers. These computers were the size of rooms and required multiple workers to use them correctly and efficiently. Nowadays, all of that (and a little bit more) is in our pocket.


So what's next after what we use today (machines driven by processors, and by how powerful those processors are)? The next level is "quantum computing". I would recommend Googling it, because it's mind-blowing and hard to explain in a few paragraphs. So I'll keep the background short and focus on how game-changing it will be.

Today, computers think in binary, either 1's or 0's (think of The Matrix). A quantum computer can think in 1's, 0's, or 1's AND 0's at the same time. So instead of making one decision at a time (1 or 0), it can explore potentially millions of possibilities at once, which is why people talk about certain computations becoming millions of times faster. Kind of a mind-fuck, huh? To continue: in today's computers, processors (think Intel) create the 1's and 0's via electrical impulses (think motherboards, wires, etc.). Quantum computers instead encode information in the quantum states of particles.
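The "1 AND 0" idea can be sketched with nothing more than a short program (this is standard textbook math, not a description of any particular company's hardware): a qubit is just a pair of amplitudes, a Hadamard gate puts it into an equal mix of 0 and 1, and n qubits carry amplitudes over 2**n classical states at once.

```python
import math

# |0> as an amplitude vector: all weight on the "0" outcome.
zero = [1.0, 0.0]

def hadamard(q):
    """Put a single qubit into an equal superposition of 0 and 1."""
    a, b = q
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

q = hadamard(zero)
probs = [amp ** 2 for amp in q]  # squared amplitudes = measurement probabilities
print([round(p, 3) for p in probs])  # [0.5, 0.5] -- half the time 0, half 1

# The payoff the paragraph above hints at: state space doubles per qubit.
for n in (10, 20, 30):
    print(n, "qubits span", 2 ** n, "classical states")
```

The exponential growth in that last loop is the real source of the "millions at a time" intuition: 30 qubits already describe over a billion classical states simultaneously.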

Now, just as the computer we know today took a good 50 years to develop, the quantum computer will take a similar amount of time to reach the consumer. But businesses are doing initial testing (particularly Google). What they're using it for today is machine learning. For example, Google is trying to build a self-driving car; within this car are programs that need to be taught the rules of the road, what a red light looks like, etc. For a computer program to get the answer correct 100% of the time takes a lot of computing power to identify unique situations, environments, etc. A quantum computer could potentially figure it out in seconds, and then Google could put that more accurate algorithm/program in your self-driving car's arsenal.

P.S. – One company producing these today is D-Wave Systems, if you're interested.

Selling Your Data Could Be the Next Source of Income

Between the NSA, Google, and advertisers in general, the amount of your data that you give away for free is astounding. Everyone seems to be mining your data nowadays, but why aren't you getting paid for it? (Technically, you sign away your "rights" to some of your data by signing any "Terms of Service" agreement.)

I've always envisioned a better way to locally store all the information from your interactions on a computer, instead of making that data accessible only to (sometimes malicious) services. These services take your data to try to better understand you, or people in your demographic. But if they really want to know what sites I visit, what I click on, where I look on my screen, etc., that information would ultimately have to be stored locally. Then, you could go to an "exchange", where advertisers, etc. "bid" on your data. For example, say they're trying to better understand 25-year-old males. My information would come up, and they could pay me $XX to know everything I've clicked on, visited, etc. in a certain time frame.
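The exchange idea above can be sketched in a few lines (everything here is hypothetical: the buyer names, the bid amounts, and the profile fields are all invented for illustration, and this is not how Datacoup actually works): my profile goes up for bid, buyers state the demographics they care about, and the highest eligible bid wins.

```python
# My locally stored profile, as the paragraph above imagines it.
profile = {"age": 25, "gender": "male", "sites_visited": 1_240}

# Invented buyers with targeting criteria and a dollar offer.
bids = [
    {"buyer": "SneakerBrandCo", "wants": {"age": 25, "gender": "male"}, "offer": 18.00},
    {"buyer": "RetirementFundCo", "wants": {"age": 65}, "offer": 40.00},
    {"buyer": "StreamingCo", "wants": {"gender": "male"}, "offer": 9.50},
]

def matching_bids(profile, bids):
    """Keep only bids whose targeting criteria the profile satisfies."""
    return [b for b in bids
            if all(profile.get(k) == v for k, v in b["wants"].items())]

eligible = matching_bids(profile, bids)
winner = max(eligible, key=lambda b: b["offer"])
print(f"Sold to {winner['buyer']} for ${winner['offer']:.2f}")
```

Note that the highest absolute offer (RetirementFundCo's $40) never reaches me, because I don't fit its targeting; the exchange only pays you for audiences you actually belong to.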

Well, that wait to start selling your data might be over, thanks to Datacoup. With Datacoup, you essentially select which data you want to share (ranging from all types of social media to your checking account). As I mentioned before, this data is worth millions, if not billions, of dollars to brands, advertisers, and data brokers. They'll pay for access to the information that you care to share. Of course, what Datacoup is doing initially is strictly creating a better "marketplace" or "exchange" for this information. I think their next step should be some type of in-browser app that stores much of this data, because today, if you think about it, these brands, advertisers, etc. can continue to go through their traditional means of collecting your data (cookies, plug-ins, etc.). But until then, I think Datacoup has set a standard going forward for corporations getting to know a particular audience, instead of creating "back-doors" to access your data.

You can always "opt out" of the tracking that is usually built into services, specifically your web browser, by going here

Unless I pick a perfect bracket and win $1 billion, this could be a nice little second income.

PS – This video is of a guy who sold his data on Kickstarter; he makes some great points about selling your data and gives some metrics for context.

Twitter’s 3 Keys to Success

Twitter's founder Jack Dorsey gave a talk a few years back that helps you step into the mind of a great entrepreneur. The talk is about 15 minutes long; it's straightforward but delivers some great insight.

His three keys to success:

1) Draw: get your idea out of your head and share it,

2) Luck: assess when the time (and the market) is right to execute your idea,

3) Iterate: take in the feedback, be a rigorous editor, and refine your idea.

4) Then, indirectly, I would add a fourth: time and experience. The combination of the two will help you identify opportunities and see the world in a new light.

The Second Industrial Revolution (Part 2 of 2)

Continuing from an earlier post, the effects of this second Industrial Revolution won’t appear overnight, and when the shift is in full swing, new jobs will be created to “manage and maintain the machines.” In this post, we’ll take a quick look at what the possible economic effects could be.

In a world of machines doing entry-level quality work, economic inequality could soar, but unemployment would not necessarily spike. If governments refuse to allow jobless workers to fall too far below the average standard of living, you'd expect the minimum wage to rise steadily, and ever more workers may find work unattractive. On the other hand, the higher the minimum wage rises, the greater the incentive to invest in capital that replaces labor. Any new jobs created would require skills and education that many mid-wage workers lack, and this could contribute to growing economic inequality.

So while technology might eliminate jobs in some older industries, as long as new technologies generate major new demand meeting new needs, the net effect does not mean permanent unemployment. Clearly some new technologies such as the driverless car will, indeed, address major unmet needs. In this process, specific jobs and specific occupations will be eliminated. This may increase economic inequality for a time. And the new opportunities will require new skills and new business models; these might be difficult and slow to develop. Nevertheless, this view of the future differs sharply from the predictions of a dystopia with permanent mass unemployment and ever-widening economic inequality. Yet the data show that the first wave of computer technology has displaced workers, not replaced them.

As long as technology continues to address major unmet needs, machines do not determine our fate. Just because machines take over some human tasks, that does not mean the end of jobs. We do, however, need to figure out how workers can develop new skills and how entrepreneurs can create new business models to generate the new demand that will provide growing employment.

P.S. – Learn how to code (for the hundredth time)