Volume LXII, Issue 5

In This Issue

Verizon Extends Virtual Networking Link to AWS

Excerpted from Light Reading Report

As enterprises and government agencies increasingly deploy cloud-based solutions, they look for performance and security in those applications.

With Verizon’s Virtual Network Services (VNS) on the Amazon Web Services (AWS) cloud, customers get the end-to-end visibility and the control needed to effectively manage mobile-to-cloud transactions.

Verizon’s VNS service on AWS complements Verizon’s vision and investment in the software-defined networking (SDN) and network functions virtualization (NFV) ecosystem by helping to enable global enterprise customers to securely connect, deploy and manage virtual networks.

Available immediately, this offering is designed to provide AWS users with services to help control network and security policy, using SDN-based technology, from the enterprise edge directly into their Amazon Virtual Private Cloud (Amazon VPC) instance.

“With this enhancement, Verizon will help enterprise and government organizations to confidently implement mission critical solutions in the cloud,” said Shawn Hakl, Vice President of Networking and Innovation, Verizon… Read More

NetApp Has Better Storage Trends

Excerpted from Barron’s Report by Tiernan Ray

Chokshi raises his price target on NetApp to $56 from $46, writing that it can achieve a 25% operating profit margin five years from now, up from 17% at the moment.

Its newer products for cloud, writes Chokshi, are giving NetApp newfound “differentiation.”

NetApp has developed a portfolio of hybrid cloud data management products that include NetApp Private Storage (released in late 2012), ONTAP Cloud (launched in late 2014), Altavault (acquired from FFIV in 2015), CloudSync and Cloud Control (both released in late 2016).

Together, these products form an effectively complete hybrid cloud data management capability, spanning mission-critical workload bursting from a private to a public cloud (NPS) to more basic disaster recovery (AltaVault).

On our annual Silicon Valley bus tour, CEO George Kurian reiterated from the Analyst Day that the company has 1,500 hybrid cloud customers (we estimate out of ~60K customers) utilizing NetApp Private Storage (NPS), and thousands of customers utilizing one of NetApp’s hybrid cloud products, with that customer base growing at a triple-digit rate… Read More

Intel’s New Data Center Storage Design

Excerpted from CloudTech Report by James Bourne

It is not quite available yet – but Intel has shed some light on its plans in the data center storage space with the announcement of a new form factor which could enable up to one petabyte of storage in a 1U rack unit.

The new ‘ruler’ form factor, named as such for self-evident reasons, “shifts storage from the legacy 2.5 inch and 3.5 inch form factors that follow traditional hard disk drives” and “delivers on the promise of non-volatile storage technologies to eliminate constraints on shape and size,” in Intel’s words.

The company adds that the product will come to market ‘in the near future.’

1U rackmounts are predominantly 19″ wide and 1.75″ high, although the depth can vary from 17.7″ to 21.5″.

As the unit count goes up, the height scales accordingly at 1.75″ per U, so a 5U mount can be 19.1″ by 8.75″ by 26.4″, while 7U, the highest, is 17″ by 12.2″ by 19.8″.

To put one petabyte into perspective, it is enough storage to hold 300,000 HD movies.
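The rack-unit and capacity figures above can be sanity-checked with a little arithmetic. The sketch below is illustrative only; it assumes decimal units (1 PB = 10^6 GB) and roughly 3.5 GB per HD movie, which is consistent with the quoted 300,000-movie estimate:

```python
RACK_UNIT_HEIGHT_IN = 1.75  # one rack unit (U) is 1.75 inches tall

def rack_height(units: int) -> float:
    """Height in inches of an N-U rackmount."""
    return units * RACK_UNIT_HEIGHT_IN

# 1U, 5U, and 7U heights match the dimensions quoted in the article.
assert rack_height(1) == 1.75
assert rack_height(5) == 8.75
assert rack_height(7) == 12.25

# One petabyte expressed in HD movies, assuming ~3.5 GB per movie
# (an illustrative figure, close to the article's 300,000 estimate).
PETABYTE_GB = 1_000_000  # 1 PB = 10^6 GB in decimal units
movies = PETABYTE_GB / 3.5
print(round(movies))  # roughly 286,000 movies
```
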

Intel also had room for a couple more announcements… Read More

Report from DCIA CEO Marty Lafferty

The inaugural National Research Platform (NRP) Workshop attracted more than one hundred scientists this week in Montana to plan work that will shape the future of the internet.

Physicists represent early adopters of big data, with projects such as the Large Hadron Collider at CERN generating exabytes (billions of gigabytes) of data, in contrast to, say, a typical Netflix movie of just a few gigabytes.

The workshop’s purpose was to strategize the deployment of interoperable science demilitarized zones (DMZs) on a national scale: a big-data superhighway for research.

The applicability of this work to other applications, including high-value content deliverability, made the discussions particularly interesting.

Application researchers described their needs for high-speed data transfer, while other discussions focused on domain science requirements and the networking architecture, policies, tools, and security needed to support a national research platform for two hundred organizations.

Academic research requirements are pushing the envelope of computer science, and software-defined networks (SDNs) are changing how scientists access computing, manage their data, and analyze their results.

The Scientific Data Analysis at Scale (SciDAS) project at the University of North Carolina will integrate multiple tools into an advanced cyberinfrastructure ecosystem to support distributed computing and the injection of large data sets and workflows into the computing environment.

The Pacific Research Platform (PRP) meanwhile will improve end-to-end, high-speed networking data transfer capabilities in collaborative, big-data science among institutions, and the National Science Foundation (NSF) requires PRP technologies to be extensible across other scientific domains and to other regional and national networks.

Techniques being discussed at the workshop to support cutting-edge research will soon be filtering into the internet at large. Share wisely, and take care.

Move to a Cloud Career from Traditional IT

Excerpted from InfoWorld Report by David Linthicum

There is a great deal of interest from those with traditional IT skills – such as enterprise architects, developers, and networking engineers – in steering themselves into a cloud computing career that will not only provide job security but also pay better.

However, the path to cloud computing riches is not that clear for most.

The good news: There is a path for many IT pros into the cloud.

This article shows you how to map a path to those jobs from your current state if you are an enterprise architect, database admin, application developer, system admin, test-and-acceptance engineer, or networking engineer.

As an example, the role of an enterprise architect is pretty general in terms of technology and platforms, but companies hiring in anticipation of moving to the cloud are looking for more specific skills.

But look at the career map in the figure below.

There are two very good paths to follow: public cloud solutions architect and cloud security architect… Read More

Computing Architecture for Autonomous Car

Excerpted from Design News Report by Charles Murray

As the autonomous car evolves, automakers face a complex question: how to enable self-driving cars to process massive amounts of data and then reach logical, safe conclusions about it.

Today, most automakers accomplish that with a distributed form of data processing.

That is, they place intelligence at the sensors. More recently, though, that’s begun to change.

Many engineers now favor a more centralized form of data processing, in which simple sensors send raw unprocessed data to a powerful central processor, which does all the “thinking.”
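The distinction between the two architectures can be made concrete with a rough sketch. Everything below is hypothetical and invented for illustration; it is not any automaker's actual software, and the sensor names and decision logic are placeholders:

```python
from dataclasses import dataclass

@dataclass
class RawReading:
    sensor_id: str
    data: bytes  # raw, unprocessed sensor output

# Distributed architecture: each smart sensor interprets its own data
# locally and sends only a compact conclusion onward.
def smart_sensor(reading: RawReading) -> str:
    return f"{reading.sensor_id}: obstacle" if reading.data else f"{reading.sensor_id}: clear"

# Centralized architecture: simple sensors forward raw data, and one
# powerful central processor does all the "thinking" in a single place.
def central_processor(readings: list[RawReading]) -> str:
    obstacles = [r.sensor_id for r in readings if r.data]
    return "brake" if obstacles else "proceed"

readings = [RawReading("lidar", b"\x01"), RawReading("radar", b"")]
print([smart_sensor(r) for r in readings])  # distributed: per-sensor verdicts
print(central_processor(readings))          # centralized: one fused decision
```

The trade-off the article describes falls out of the sketch: smart sensors reduce the data sent to the controller, while the central processor sees all raw inputs at once and can fuse them before deciding.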

To learn more about distributed and centralized architectures, Design News talked with Davide Santo, an engineering veteran of Motorola and Freescale Semiconductor, and now the Director of the Autonomous Driving Lab for NXP Semiconductors.

Here, Santo offers his views on the topic.

DN: Let’s start with definitions. Could you define distributed and centralized autonomous vehicle architectures for us… Read More

Eliminating Network Blind Spots

Excerpted from GCN Report by Reggie Best

Each year a growing number of critical cyber incidents are discovered in government systems and networks.

Most often, these incidents are reported only after significant damage has been done and critical, secret, or personally identifiable data has been compromised or exfiltrated.

In addition, there has been a significant rise in ransomware attacks, as evidenced by this year’s highly public examples, WannaCry and NotPetya.

And as the number of attacks increases, so does their sophistication, making it difficult to ensure networks are properly secured while still providing availability to critical data and systems.

It’s a challenging balance for government agencies, but protecting networks, systems, and information while continually providing essential services to the public is achievable.

But while this balance can be struck, it’s important to consider the myriad security threats facing government networks and to remember that they contain highly confidential, sensitive or proprietary information.

As government organizations increasingly move to the cloud, their networks become more complicated and vulnerable… Read More

Enabling the Multiverse of Mini-Internets

Excerpted from ComputerWorld Report by Alan Carlton

I am a believer that the Device Virtualization vision will manifest fully one day, but we have a long road to travel to this end game.

The related notion of Service Endpoint Agent (SEA) devices that I have written about is simply a continuation of emerging edge and fog computing trends that will ultimately drive the transformation of wireless networks into flexible, new, value-added service offering environments where said services are literally just one hop away.

However, realizing this vision will require the edge to transform into a different type of mini-internet that behaves just a little differently than the wider internet it connects to.

Today, there is too often a tendency in marketing literature to show the edge as a simple “box” that is collocated with some form of radio access units.

The reality is obviously slightly more complicated than this.

These so-called edge boxes actually form part of a plurality of Layer 2 transport networks that typically connect said radio units to a point of interconnect with the Layer 3 internet… Read More

Understanding Chaos Theory

Excerpted from Cloud Computing Expo Report by William Schmarzo

We all probably remember the movie “Jurassic Park,” even if we don’t remember this exact scene: Dr. Malcolm, played by Jeff Goldblum, is explaining Chaos Theory to Dr. Ellie Sattler, played by Laura Dern.

Dr. Malcolm is explaining how random, seemingly negligible events can disrupt even the most carefully laid out plans.

Dr. Ian Malcolm (after the T-Rex failed to appear for the tour group): “You see, a Tyrannosaur doesn’t follow a set pattern or park schedules, the essence of chaos.”

And then later in the movie:

Dr. Ian Malcolm: “Oh, yeah. Oooh, ahhh, that’s how it always starts. Then later there’s running and um, screaming.”

Yes, running and screaming. That’s what happens when even the most carefully developed plans eventually succumb to the compounding of all these “random, seemingly negligible” events.

And understanding the ramifications of Chaos Theory is becoming even more relevant as we look to machines to take over increasingly complex tasks… Read More

Are Organizations Safe From Cyber Attacks?

Excerpted from International Business Times Report by AJ Dellinger

Despite two major malware attacks that affected hundreds of thousands of computer systems in countries all over the world, a majority of information security experts still believe organizations are lacking in protections needed to prevent being hit by another attack, a survey found.

Threat detection firm Tripwire surveyed 108 security professionals at the Black Hat USA hacker conference held in Las Vegas in July.

It found a considerable number of experts who were dismayed by the response of organizations in the wake of attacks like WannaCry and Petya.

Two-thirds of all respondents — 68 percent — said, despite the considerable scare of global malware attacks, they did not feel confident enterprises on the whole had made the necessary investments to improve security and protect against a future outbreak.

The news wasn’t all bad, though. An even larger majority of respondents – 84 percent – said the organizations they worked for had made investments that would help to mitigate cybersecurity risks and defend against attacks… Read More

Dynamic Approach to Hybrid Cloud

Excerpted from CIOReview Report

Cloud computing has become one of the most innovative technologies of the decade.

It simplifies data storage and provides the convenience of access from any location for real-time data, a benefit that is reaped by governments, enterprises, SMBs, as well as individuals.

It is a well-established fact that the majority of cloud users want access to all the data stored in the cloud without losing its privacy.

Hence hybrid cloud’s popularity has skyrocketed in recent years, as it combines third-party public cloud services and on-premises private cloud services within a single cloud computing environment.

Hybridization of cloud services is gaining momentum as businesses increasingly shift toward digitalization.

A hybrid cloud offers the best leverage for organizations beginning with cloud computing and reduces the risk of data loss.

It helps companies shape the design of their future cloud, with the public cloud used to handle workloads… Read More

Best Hybrid Cloud Vendors

Excerpted from CBR Report by James Nunns

Hybrid cloud is the future, so it’s best to know which vendor will be right for your business.

The vast majority of analyst research points to the future of cloud computing being a hybrid one.

Public cloud will exist, private cloud will exist, but for most businesses, their cloud environment will be a mix.

There’s a vast mix of reasons why hybrid cloud is a better choice than being all in private or all in the public cloud – regulatory reasons, financial constraints, data privacy concerns, data location, and so on.

The largest tech vendors in the world have figured out, some sooner than others, that hybrid cloud is the future.

And, unsurprisingly, they’ve all taken steps to offer this to their customers.

CBR has put together a list of the best options in the market to help you decide what vendor is going to be the right one for your hybrid cloud deployment.

Microsoft has the size and scale to support large scale public cloud and large scale private cloud deployments, thanks to its massive data center footprint… Read More

Edge Computing: Future of the Internet

Excerpted from e27 Report by Kevin McSpadden

For tech entrepreneurs and investors, one of the most valuable insights in the business is understanding where the infrastructure is heading.

Building an internet company in 2012 without integrating cloud computing would have been a giant missed opportunity.

In the digital economy, how we live, work and play is constantly evolving and unlike previous telecommunication inventions – like the television – the change is consistently dramatic.

Over the past 10 years, the internet infrastructure has basically transitioned from Black & White to Color over, and over, and over again.

Five years ago, cloud computing was ‘the future’ and even today we talk about the cloud as if it is still a nascent technology – which, to a certain degree, it is.

Cloud companies get so much funding because in modern societies, entire companies run on the cloud – and not just their tech infrastructure but also payroll, financials and internal communications.

But, the cloud is not the future and, thankfully for us, the future has already been invented… Read More

What to Know Before Deploying Edge

Excerpted from InfoWorld Report by David Linthicum

I explained edge computing back in May, and how it’s related to cloud computing.

But I continue to get questions on the use of edge computing, especially on whether enterprises should begin to use edge computing anytime soon.

To make that decision, there are three aspects of edge computing that you should consider:

1. Edge computing is tactical, not strategic.

Edge computing is about putting processing and data near the end points.

This avoids transmitting the information from the point of consumption, such as a robot on a factory floor, back to centralized computing platforms, such as a public cloud.

The core benefit of edge computing is to reduce latency, and as a result increase performance of the complete system, end to end.

Moreover, it lets you respond to some data points more quickly, such as shutting down a jet engine that’s overheating… Read More
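The latency argument above can be made concrete with a back-of-the-envelope sketch. The round-trip figures below are illustrative assumptions, not measurements of any real deployment:

```python
# Illustrative latency budget for one sensor-to-decision round trip.
# Both millisecond figures are assumptions for the sake of example.
EDGE_RTT_MS = 2    # processing at a nearby edge node
CLOUD_RTT_MS = 80  # WAN round trip to a distant public cloud region

def decisions_per_second(rtt_ms: float) -> float:
    """How many sequential sense-decide-act loops fit in one second."""
    return 1000 / rtt_ms

print(decisions_per_second(EDGE_RTT_MS))   # 500 loops/s at the edge
print(decisions_per_second(CLOUD_RTT_MS))  # 12.5 loops/s via the cloud
```

Under these assumed numbers, the edge path completes forty times as many control loops per second, which is the kind of end-to-end performance gain (and fast response to critical data points, like an overheating jet engine) the article is describing.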

Edge Next Big Thing for Data Processing

Excerpted from TECHYOUnME Report by Pragati Pathrotkar

With technology making gigantic strides every passing day, cloud computing will soon become passé.

Edge computing will be the new paradigm with a majority of processing expected to take place at the device level.

This is based on the sound analysis of where computing is headed, as stated by an executive at venture capital firm Andreessen Horowitz.

This is because devices such as drones, robots, and autonomous cars will need extremely rapid processing as they become more common.

The desired processing speed is much faster than can be achieved if data must be sent to the cloud and the device has to wait for a response.

While most companies are still warming up to the idea of moving to the cloud, some enterprises are already at the point of supplanting cloud computing and moving on to the next paradigm.

However, this does not mean the cloud will become obsolete or lose its key place in meeting computing needs.

With the new paradigm, the role of cloud computing is expected to witness a dramatic change… Read More

Global Fog Computing Market Report

Excerpted from QYR Research Press Announcement

The global market for Fog Computing is influenced by a variety of factors, an elaborate assessment of which is covered in the report.

The report on Fog Computing offers in-depth insights into the key market dynamics, notable trends, emerging opportunities, strategic dynamics of major players, and recent technological advancements impacting the growth of the market in various regions.

The study is a reliable source of qualitative and quantitative analysis of current and emerging business risks likely to shape the competitive dynamics.

The evaluation, made with the help of inputs from a wide spectrum of credible secondary sources and various primary sources, offers participants a clear picture of the trajectory of the market over the forecast period 2017 – 2022.

The findings will help stakeholders identify key factors fueling the growth of prominent segments over the forecast period. This report studies the global Fog Computing market, and analyzes and researches the Fog Computing development status and forecast in the United States, EU, Japan, China, India, and Southeast Asia… Read More

Coming Events of Interest

Industry of Things World Europe — September 18th and 19th in Berlin, Germany. Join more than 1,000 high-level executives to rethink your technology and business strategy for scalable, secure, and efficient IoT.

IoT Solutions World Congress — October 3rd through 5th in Barcelona, Spain. This event has grown enormously in no time and is an excellent barometer and source of information, inspiration, collaboration and transformation.

2017 Storage Visions Conference — October 16th in Milpitas, CA. “New Visions for Digital Storage” will bring together the vendors, end-users, researchers, and visionaries who will meet the growing demand for digital storage for all aspects of unstructured and lightly structured data.

INTRASECT — November 1st and 2nd in Washington, DC. The first conference of its kind to engage key stakeholders in a comprehensive and engaging examination of existing and future regulatory policy governing the usage of commercial autonomous vehicles.

Government Video Expo & National Drone Show — November 28th-30th in Washington, DC. The 22nd annual GVE will feature a full exhibit floor with numerous training options, free seminars, keynotes, networking opportunities, and five new educational pavilions.

Delivery of Things World 2018 — April 23rd and 24th in Berlin, Germany. Meet the most influential DevOps practitioners and experts and discuss what DevOps means for your business.
