Tag Archives: Facebook

Surviving and Thriving as the Internet of Everyone Evolves into a Ubiquitous Reality

The Quantified Self (QS) movement began with fringe consumers obsessed with self-measurement, but today’s Internet of Things (IoT) – with sensors on and inside bodies, connected cars, and smart homes, offices, and cities – is expanding it to include everyone. Consumers will face no shortage of devices or data to choose from anytime soon. Looking out further, to 2025, three specific factors will drive the technical evolution of the QS/IoT as a computing platform, each with implications for consumer relationships: improvement of individual devices; integration, from aspects of inner self to a holistic view of inner, outer, and extended self; and intervention in consumer actions.

  • Improvement: Before too long, gimmicky and overpriced devices will disappear from the market, while runaway hits will make headlines (and millions of dollars). From 2005 until now, sensors have driven QS – specifically, sensors attached to or focused on humans. An early example is fitness wearables, but they’re already a commodity; today’s Samsung, Google, and Apple smartwatches are a natural evolution. Bragi headphones now do health tracking; Samsung’s Artik platform, Intel’s Curie and GE’s GreenBean offer startups an easy way to create consumer IoT devices. Image sensors – cameras – enable gesture interfaces and new channels like lifelogging, where users of Twitter’s Periscope and Facebook’s Facescope live-stream their lives.
  • Integration: Fitness trackers and action cameras capture data on or next to consumers’ bodies. IoT technologies quantify consumers’ “inner selves,” and marketers can learn as much from them as they have by examining purchase histories, web surfing habits, and other digital footprints. Other IoT datapoints include vital signs from exercise, sports, and adventure wearables; food, from precision agriculture to smart utensils like HAPIfork, to microbiomes and Toto’s smart toilet; and medical bioelectronics, personal genomics, and mood- and mind-monitoring like Neurosky. The IoT also tracks consumers’ outer selves – family, via smart baby bottles and wearables for pets – and extended selves, via connected thermostats, diagnostic dongles in cars, and image-recognition systems in stores and on city streets.

Continue reading

Scalability is the Last Hurdle for Liquid Cooling to Change the Game for 50GW Worth of Data Centers

Late in 2014, Hong Kong-based startup Allied Control announced that its immersion-cooled server containers have 1.4 MW of server capacity and deliver a power usage effectiveness (PUE) of less than 1.01 (cooling only). PUE is the ratio of the total energy delivered to a data center to the energy consumed by its servers, and is the de facto standard for measuring data center efficiency; the theoretical optimum is 1.0. As an example, Facebook’s average facility PUE is 1.08 to 1.1, whereas the industry average is approximately 60% higher (see the report “Blowing Hot Air – Uncovering Opportunities to Cool the World’s Data Centers” – client registration required). Allied Control stated that the containers can handle a 500 kW heat load while allowing for a high density of server racks (W/ft²). We alluded to the potential of liquid cooling technologies for data centers early in 2014 (see the analyst insight “Data centers need to get over their hot air” – client registration required), but with more competitors coming to the fore, an in-depth comparison is needed.

At present, there are more than 3,500 data centers in 103 countries (see Data Centers Map’s study), which together demand approximately 50 GW of electricity globally – 40% more than New York City on the hottest day of the year. As a result, operators are scrambling to reduce their facilities’ PUEs, since energy is a much higher share of operational cost (OpEx) for this building segment than it is for office buildings, for example. While operating, servers in data centers produce heat that must be extracted to keep the servers running – akin to a power plant whose reactor vessel must constantly be cooled. Allied Control’s PUE claims appear far ahead of even Google’s; however, it is important to differentiate between cooling PUE and facility PUE. The two values differ in where the boundary is drawn around energy use – cooling PUE considers only cooling energy, while facility PUE also counts electrical losses in distribution cabling and even at the substation. Google, which operates some of the largest data centers, has reduced its data centers’ PUE drastically over time, from 1.21 to 1.12 (for facilities larger than 1 MW), keeping well below the industry standard of 1.8 to 1.89 (see figure 3, “Power Usage Effectiveness (PUE) is Trending down Year On Year” – client registration required). Google achieves these low PUEs with strategies such as hot-aisle containment, air-based free cooling, and direct evaporative cooling. Small-scale data centers (below 1 MW), on the other hand, often use refrigerant-based computer room air conditioning (CRAC) systems (see the report “Blowing Hot Air – Uncovering Opportunities to Cool the World’s Data Centers” – client registration required), which drive PUEs much higher. These direct-expansion (DX) systems also carry high costs (an average of 100 W/ft²), and a major issue with them is their limited scalability.
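To make the cooling-vs-facility PUE distinction concrete, here is a minimal sketch of both calculations; the load, cooling, and loss figures below are illustrative assumptions chosen to echo the numbers above, not measured vendor data.

```python
def cooling_pue(it_power_kw, cooling_power_kw):
    """Cooling-only PUE: counts only cooling energy as overhead."""
    return (it_power_kw + cooling_power_kw) / it_power_kw

def facility_pue(it_power_kw, cooling_power_kw, distribution_loss_kw):
    """Facility PUE: also counts electrical distribution losses
    (cabling, UPS, substation) inside the boundary."""
    return (it_power_kw + cooling_power_kw + distribution_loss_kw) / it_power_kw

# Hypothetical 1 MW IT load
it = 1000.0      # kW of server (IT) load
cooling = 8.0    # kW of cooling energy for an efficient immersion design
losses = 60.0    # kW of assumed electrical distribution losses

print(cooling_pue(it, cooling))           # 1.008 -> "less than 1.01 (cooling only)"
print(facility_pue(it, cooling, losses))  # 1.068 -> the all-in facility figure
```

Note how small the cooling overhead must be to hit a sub-1.01 cooling PUE at 1 MW – roughly 10 kW or less – while the same facility’s all-in PUE is noticeably higher once distribution losses are counted.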
Over the past three months, we profiled companies that provide liquid-based cooling systems with claims of reduced PUE and cost, as well as increased capacity, density, and scalability. Rather than using air as the heat transfer medium, they either extract heat from servers via fluid-cooled heat-exchange plates or submerge all the electronic components (server motherboards) in a dielectric fluid. Fluid cooling works for a manufacturing facility or power station, so on the surface the approach seems reasonable for a data center. The performance of these liquid cooling products, however, is all over the map, as we show in the table below.

Insight_1_18_15

1 Compared to air-based CRAC systems
2 Optimistic claim per our estimate
3 Based on case studies
4 Theoretical

First, their heat transfer media (dielectric fluids) have different characteristics. Iceotope (client registration required) relies on 3M’s dielectric fluid, branded “Novec,” to submerge the motherboards, whereas others use proprietary dielectric fluids. When we spoke with executives at these startups, companies like LiquidCool Solutions (client registration required) and Green Revolution Cooling (GRC) (client registration required) claimed their proprietary fluids outperform Novec; however, they could not substantiate these claims. “Max. Coolant Temperature” is the maximum temperature of the cooling loop into which server heat is dumped, and it varies widely among these companies. The 8 °C gap (between 45 °C and 53 °C) is an important difference because systems that tolerate a higher maximum coolant temperature require less energy input, which directly lowers the cooling PUE. Despite having the highest maximum coolant temperature, GRC has the highest PUE among its competitors. It’s important to note that Iceotope is one of the few that has actually installed pilot projects and calculated a PUE from a case study; the other figures are vendor claims or theoretical values. Although Allied Control uses the same dielectric fluid as Iceotope, it claims a lower cooling PUE; Allied Control has yet to prove that claim with a case study or pilot project. On server density, Iceotope boasts the highest figure, but its two major problems are its as-yet-unstructured pricing and its limited ability to scale to larger data centers. In our interview, Iceotope stated that it targets small-scale data centers and that, for this reason, scalability is not its primary focus. Among the remaining competitors, GRC has the highest density with competitive pricing.
Another advantage of GRC is its high capacity and a plausible “Usable Floor-Space Increase” claim, whereas Clustered Systems’ claim is, per our estimate, overly optimistic. Liquid-based cooling systems do not require ducting and fans inside a data center, which increases usable floor space.

According to one survey, existing data centers are roughly equally divided by facility size (“Data Center Facilities Are Equally Divided by Facility Size” – client registration required); however, we are seeing the most growth in the mega-scale segment – for example, Google’s recent decision to build a massive 120 MW data center (client registration required). Developers and owner-operators weigh many factors in siting a new facility, such as climate, electricity supply security, and availability of human capital. Large data centers are often built in cool climates – such as Google’s new data center in the north of the Netherlands – but especially hot climates need mechanical cooling, as they cannot depend on free (or evaporative) cooling alone. Singapore, for instance, is a high-interest area for data center developers. This is where liquid cooling holds strong potential: it can ease the energy demand of data centers expanding into areas with inhospitable climates that necessitate mechanical cooling. Clients are advised to monitor liquid-based cooling technologies – in particular the progress of Green Revolution Cooling, which offers a scalable product with high density and competitive pricing, and intends to expand into regions with high data center growth. Lux will continue to monitor developments in this market, with a keen focus on scalability.

Google and Facebook’s Drone Strategies, from Buzz to Breakthroughs: The Sky’s the Limit

The technology world is abuzz with the recent announcement that Google is buying Titan Aerospace, a maker of high-altitude unmanned aerial vehicles (UAVs) that Facebook had only recently been considering (it bought Ascenta for $20 million instead). Ostensibly, both companies are looking at UAVs (also referred to as “drones”) as an opportunity to deliver Internet access to the roughly five billion people who lack reliable land-based access today. But that goal still leaves many people wondering about the business rationale – how will billing work, who will pay to advertise to the unconnected masses, and what are those technology giants really up to anyway?

To understand why content providers are spending billions on drones, you have to think about their long-term strategy. Recently, there was a huge defeat for Google and other content providers in a ruling about what’s called “Net Neutrality.” It basically says that landline and mobile carriers like AT&T and Verizon can start charging more for people to access certain sites, even though they swear the action will not be anticompetitive. So, for example, you might have to pay the carrier extra to see YouTube (which Google owns) or Instagram (which Facebook owns) or Netflix or Amazon Prime movies. In fact, just in February Netflix struck a deal to pay Comcast, which reportedly has already yielded faster access times but has not stopped the partners from bickering over unfair competition and exertion of power. Also, AT&T has a $500 million plan to crush Netflix and Hulu, so the competitive backstabbing has already begun.

Where do drones disrupt this strategy? Most obviously, having their own networks would allow Facebook and Google to bypass the domination of wireless and wireline carriers (like AT&T and Verizon in the U.S.) whose business practices – e.g. knocking down Net Neutrality – are geared towards throttling content providers like Facebook, Google, and their partners and subsidiaries like YouTube. Need more bandwidth? New neighborhood being built? Blackout? Natural catastrophe? Launch more drones – and expand service in hours, not years. Drones serving network connectivity allow Google, Facebook, and Amazon to bypass the toll lanes – and, incidentally, make instantly obsolete the landline infrastructure that their enemies Comcast, AT&T, and Verizon have spent decades and tens to hundreds of billions of dollars building out. Connectivity in emerging markets is a feint – look for delivering content in the developed world to be the first battle, and call these Machiavellian strategies the “Game of Drones.”

Could this really happen? Both drone technology and wireless connectivity technology are relatively mature and work well. Both are still improving every year of course, and it is possible to deliver some connectivity via drones today. However, more innovation is needed for them to be commercially viable, and future incremental development will be about integrating and improving parts, so more people can have more bandwidth with greater reliability and lower cost. For example, the engineers might integrate the broadband transceiver antenna with the drone’s wings (as Stratasys and Optomec have tried — client registration required) which could eliminate the cost and weight of a separate antenna, while allowing the antenna to also be very large and more effective. Drones’ needs could drive development of battery chemistries that outperform lithium-ion (client registration required), like lithium-sulfur (client registration required) from companies like Oxis Energy (client registration required). High-performance composites and lightweight, lower-power electronics technologies like conductive polymers (client registration required) will also be key.

What’s next? One of the most obvious additional uses would be to attach cameras and use the drones for monitoring things like traffic, agriculture, and parks, even finding empty parking spaces – things that an AT&T repair van can never do. Maybe the drones become telemedicine’s robotic first responders (client registration required), sending imagery of accidents as they happen, and swooping down to help doctors reach injured victims within seconds, not minutes. While these examples may seem far-fetched, it’s very hard to say exactly what drones will be used for – if only because our own imaginations are limited.

Within autonomous airspace, there’s much more flying around than just glider-style UAVs. For example, Google’s “Project Loon” has similar stated goals of delivering internet access. The new investment in Titan does not necessarily mean Google is leaving lighter-than-air technologies; it’s just that Google has already invested in that technology and is now looking at other aircraft platforms for doing similar things in different environments. Investments in small satellites from companies like SkyBox and PlanetLabs are also taking off. And of course, there are Amazon’s delivery drones – rotary-wing UAVs more like helicopters: speed and navigation in small spaces are important, and they need to carry the weight of packages, so they need to be small and powerful.

Each of these technologies has spin-off effects – both threats and opportunities – for companies in adjacent spaces, such as materials or onboard power. Only batteries or liquid fuels are energy-dense enough for rotary-wing aircraft, while Google’s Titan and Loon aircraft are more like glider planes or blimps: big, light, and slow, staying in roughly the same place for hours, days, or even years. Solar collection requires a large area, so big glider and blimp drones can run on solar power. Technology providers in these areas stand to gain if more companies deploy their own UAV fleets.

So, UAVs are an important strategic technology for both companies, even if the money-making part of the business is far off. Yes, someday you might have a Google drone as your ISP, but that’s not the primary business case behind these investments today. Google and Facebook need to make investments in these airborne platforms for the same reasons that countries did 100 years ago – to defend their territory, metaphorically speaking. For example, Nokia should have done a better job launching smartphones before Apple and Google, and Kodak should have launched digital cameras before all the consumer electronics companies did. If Google and Facebook (and Amazon, and others…) don’t have drone technology in five to 10 years, they may be as bankrupt as Nokia and Kodak (ironically, Nokia launched mobile phone cameras, which accelerated Kodak’s bankruptcy). Instead, it may be today’s mobile phone and cable television providers who go the way of the landline.

Looking beyond the land of information technology, these examples are powerful illustrations of the fact that we seldom actually know what any new technology is really going to be used for. Even today, we dismiss mobile phone cameras, Facebook, and Twitter as frivolous social tools, but where would Tunisia and Egypt be today without them? Local Motors (client registration required) is just making one-off dune buggies – until GE sees that their microfactories are the future of manufacturing appliances, too. Crowdfunding is just a bunch of kids selling geegaws – until products like the Pebble smartwatch beat the Samsung Gear (client registration required), start challenging the now-retreating Nike Fuelband, and even attack the smart home market. Google and Facebook may say today that they intend to bring connectivity to new places, even though nobody can really say what they’ll do in 2018. While they probably have secret plans, those plans are almost certainly wrong – but better than no plan at all. Companies that plan to survive beyond a few quarterly earnings calls have to make sure they are well positioned to catch whatever falls from new technology’s blue skies.

As VCs Retreat, Four New, Nimble Innovation Funding Structures Step In

VC_Article_Graphic_2013

Venture capital is seemingly synonymous with innovation. Venture-backed software companies like Google, Facebook, and Twitter have launched products that billions of consumers use on a daily basis, and their funders have reaped huge dividends.  VC has also catalyzed successful biotech and cleantech companies like Genentech and Solazyme. Thousands of important, successful companies would not exist if it had not been for venture investors providing funding and business acumen to entrepreneurs and inventors.

And that’s why it’s so worrisome that traditional venture capitalists are in increasingly obvious retreat. Over the past few years:

  • VCs are doing less: dollars and deal count are down. According to the National Venture Capital Association (NVCA), venture funding peaked in 2011 with $29.7 billion going into 3,986 deals. In 2012 it fell to $27 billion across 3,796 deals, and while 2013 is not yet wrapped, it looks to be another down year. VCs are also raising less money to invest: through September this year they had brought in $11.6 billion, a plunge from $16.2 billion in the first nine months of 2012. In fact, in 11 of the past 13 years, VCs invested more money than they raised. All these declining numbers point to a long-term shrinkage of VC – meaning less money and mentorship for innovation.
  • VCs are struggling to create value, especially outside software and drugs. Medical biotech and software have long been the dominant categories of VC investment (taking about 15% and 60%, respectively). But with global warming fears and oil prices soaring from 2005 to 2008, many investors briefly branched out into “cleantech.” For example, VC firm Khosla Ventures just put another $50 million into its biofuels maker KiOR (client registration required), which has been on a downward spiral of missed production milestones, producing just 80,000 gallons this year to date (it had claimed it would produce more than 3 million gallons by the end of 2013). VCs’ other forays (client registration required) into areas like nanomaterials have fared similarly. Sadly, the only return on many of those investments was to make VCs realize that they don’t understand the science, engineering, and economic constraints of the technology – and that even if they did, commercializing it takes longer than VCs can wait.
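The scale of the retreat described above can be checked with simple arithmetic on the NVCA figures cited; this is an illustrative calculation on the numbers already in this section, not additional data.

```python
# NVCA figures cited above, in $ billions
invested_2011, invested_2012 = 29.7, 27.0        # full-year venture investment
raised_9mo_2012, raised_9mo_2013 = 16.2, 11.6    # funds raised, Jan-Sep

# Year-over-year percentage changes
invest_decline = (invested_2012 - invested_2011) / invested_2011 * 100
raise_decline = (raised_9mo_2013 - raised_9mo_2012) / raised_9mo_2012 * 100

print(f"Investment:  {invest_decline:.1f}%")   # -9.1%
print(f"Fundraising: {raise_decline:.1f}%")    # -28.4%
```

The asymmetry is the notable part: fundraising is shrinking roughly three times faster than investment, which is consistent with the observation that VCs have invested more than they raised in 11 of the past 13 years.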

Fortunately, several alternatives to traditional venture capital are arising to take up the slack. Where will they complement VC, and where will they replace it?

  • Corporate VC invests $5 billion: Corporate VCs like Intel Capital, BASF Ventures, and Monsanto Growth Ventures are large corporations’ way of staying abreast of, and investing in, promising new technologies. Last year they invested about $3 billion to $5 billion – less than a fourth of conventional VC – but CVCs put it towards areas like industrial and agricultural technology that traditional VCs don’t know how to commercialize.
  • Conscious Capital (aka “impact investment”) grows to $9 billion: As legendary investor Warren Buffett recently argued in a New York Times op-ed, charity has the potential to better achieve its goals if it adopts more business-minded principles. JP Morgan recently estimated that impact investment will grow by 12.5% this year to $9 billion. And many super-successful entrepreneurs like Jeff Bezos (Amazon), Elon Musk and Peter Thiel (PayPal), and Richard Branson set aside money for pursuing technically audacious goals (client registration required). VCs can’t make such long-term, high-risk bets with their partners’ money, but firms like Bezos Expeditions, Breakout Labs, and the Skoll Foundation can. They are investing in companies like Modern Meadow (client registration required), which grows meat from cells, lowering the need for both natural resources and animal suffering, and D-Wave (client registration required), the world’s only quantum computer manufacturer.
  • Competitions bring in $2 billion, but have outsized impact: Like business-minded conscious capital, innovation competitions are based on the premise that competing for investment makes the recipients stronger. While high-profile programs like the X Prize and NASA Centennial Challenges are the best known, the Institute for Competition Sciences, which documents data and best practices in the area, estimates that 30,000 competitions take place worldwide annually. While they are a smaller slice of the overall dollar pie and seldom fund an innovation entirely, they amplify the value of every other investment in the organization.
  • Crowdfunding brings in $5 billion: Sites like Kickstarter and Indiegogo help small entrepreneurs and inventors get seed money from thousands of individuals, usually in exchange for the product itself or merchandise like stickers and t-shirts. Then there are “pure science” crowdfunding sites like Microryza, FundaGeek, and Petridish.org, which support experiments and research that may or may not yield a tangible return for the donor. Crowdfunding brought innovators some $1.5 billion in 2011 and $3 billion in 2012, and will hit $5 billion this year. As with the other sources, crowdfunding’s biggest benefit is not just the money – a fundraising campaign brings publicity, customer input, and community-building all at once.
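Totaling the four funding streams above – using the estimates cited in this section, in $ billions – makes the aggregate explicit; this is a trivial calculation on the section’s own figures, not new data.

```python
# Annual funding estimates from this section, in $ billions
funding = {
    "Corporate VC": 5.0,       # upper end of the $3-5B range cited
    "Conscious Capital": 9.0,  # JP Morgan estimate for this year
    "Competitions": 2.0,
    "Crowdfunding": 5.0,       # projected for this year
}

total = sum(funding.values())
print(total)  # 21.0 -> the four streams together top $20 billion
```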

Venture capital is slowly shrinking, while the four new forms of funding – Corporate VC, Competitions, Conscious Capital, and Crowdfunding – are set to pass $20 billion in aggregate and are growing fast. In fact, it seems inevitable that they will surpass VC in the coming year or soon thereafter. It’s important to keep in mind that these new forms of funding can both complement traditional venture investment and compete with it by offering better terms to inventors and entrepreneurs. Whether they are competing or collaborating, innovation can only benefit from these novel approaches.