February 20, 2017
Volume LX, Issue 3
In This Issue
- Fog Computing Market
- Serverless Computing
- Mobile Edge Sways NP
- Report from the CEO
- New CSA Security App
- DHS & Cyber Defenses
- GDPR’s Cloud Biz Opp
- Secure Connected-Car
- NY State Cyber Rules
- Finance Ind & DevOps
- Optimizing Net Design
- IBM Is Fit for VZ Cloud
- Spanner from Google
- AWS Videoconference
- Fog Ref Architecture
- Battle for Web Future
- Coming DCIA Events
Orbis Research’s Global Fog Computing Market 2017-2021 report enhances decision-making capabilities and helps readers craft effective counter-strategies to gain competitive advantage.
The internet of things (IoT) and cloud computing are the key enablers of the fog computing market.
IoT is a network of physical devices, sensors, and machines integrated into all objects connected through the internet for effective data communication.
It creates smart communication environments such as smart homes, shopping, transportation, and healthcare.
IoT enhances operational efficiency by increasing the speed of communication over the existing infrastructure, which, in turn, improves business productivity in all industrial setups.
It focuses on improving the process capabilities by enabling real-time business decisions with data storage and computing capacity at the basic sensor level.
The report covers the present scenario and the growth prospects of the global fog computing market for 2017-2021… Read More
Cloud computing is the new normal: businesses no longer wish to run or maintain their own infrastructure, and the cloud’s elastic capacity means they pay only for what they need.
As cloud adoption and trust grows, so does the breadth of services available for consumption.
There are well-known services such as Amazon’s Elastic Compute Cloud (EC2) for compute, Simple Storage Service (S3) for storage, and the Amazon Relational Database Service (RDS) for databases.
Microsoft has its equivalents to these services in Azure.
Some solutions offered through the cloud are decidedly more niche – the Elastic Transcoder for transcoding videos on demand, for example – and there are a huge number of SaaS solutions that can replace many common business applications.
IaaS is probably the most flexible way of running compute in the cloud.
With complete control over your virtual machine, you can install exactly the software you require and set up the virtual machine precisely how you wish… Read More
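That boot-time flexibility is commonly expressed as a cloud-init user-data file supplied when the virtual machine launches. As a hedged sketch (the package names and commands below are illustrative, not from the article), such a file installs exactly the software you require at first boot:

```yaml
#cloud-config
# Illustrative cloud-init user data for an IaaS virtual machine:
# install precisely the software you need and configure it at first boot.
packages:
  - nginx
  - ffmpeg        # e.g. run your own transcoding instead of a niche service
runcmd:
  - systemctl enable --now nginx
  - echo "provisioned via user data" > /var/www/html/index.html
```

Both EC2 and the Azure equivalents accept user data of this kind, so the same provisioning recipe can travel with the workload between providers.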
Along with XONA Partners, Mobile Experts presents compelling new analysis for Mobile Edge Computing.
This report provides a clear explanation of how edge computing impacts performance for the mobile network.
This study describes the technology of multi-access edge computing (MEC) and shows variations defined by the ETSI MEC ISG, OpenFog Consortium, Open Edge Computing (OEC), and includes architectures such as cloudlets that open up the business model.
The report highlights the features of MEC that will enable each application: latency benefits, backhaul benefits, and more.
“Edge Computing is a companion technology for 5G, and will be an important part of creating monetization applications which open up new sources of revenue for operators,” commented Principal Analyst Joe Madden.
Mobile Experts provides clear guidance on how MEC enables these applications; the report also covers the technical requirements MEC imposes on mobile operators, and explores the business case in order to illustrate the opportunities and threats inherent in MEC for a 4G/5G operator… Read More
As a result of increasing internet connectivity and intelligence, remote-controlled and autonomous watercraft will soon join the list of disrupted transportation systems, and are expected to be safer, more efficient, and cheaper to run.
Advances in sensors, communications, and computing are enabling ships and boats to join the ongoing revolution in vehicular autonomy already ranging from cars to planes, helicopters, and trains.
Thanks to grand-scale internet of things (IoT) projects underway in Europe and Asia, robotic watercraft are now poised to become a reality.
The Advanced Autonomous Waterborne Applications (AAWA) project co-sponsored by Rolls Royce has the objective of developing the technology for such a vessel capable of operating in coastal waters by 2020.
Researchers at DNV GL, an international ship-certification organization, are exploring the feasibility of using autonomous battery-powered vessels to transport freight along Norway’s long coastline.
The European Union (EU) has contracted the Fraunhofer Center for Maritime Logistics and Services for its Maritime Unmanned Navigation through Intelligence in Networks (MUNIN) to assess the technical, economic, and legal feasibility of operating an un-crewed merchant ship autonomously during an open-sea voyage.
And China’s Maritime Safety Administration and Wuhan University of Technology have partnered in the Un-crewed Multifunctional Maritime Ships Research and Development (UMMSR&D) to find ways for autonomous ships to be used within China’s own commercial and military maritime sectors.
The Sweden-based Safety and Regulations for European Unmanned Maritime Systems (SARUMS) group and the UK-based Maritime Autonomous Systems Regulatory Working Group are simultaneously addressing the required regulatory changes, which, as is often the case, are more of a gating factor than the related technological advancements.
Their aim is to ensure that the next iteration of the International Convention for the Safety of Life at Sea (SOLAS), which governs international shipping, will accommodate these technologies.
Autonomous vessels can be lighter, sleeker, more fuel efficient, and more piracy resistant than crewed ships, and can offer more cargo space.
Intelligent ships will also make maritime careers more attractive by eliminating lengthy on-board duty assignments, transferring technical jobs to ports-of-call or land-based operations.
Share wisely, and take care.
The Cloud Security Alliance (CSA), the world’s leading organization dedicated to defining and raising awareness of best practices to help ensure a secure cloud computing environment, today announced the launch of STARWatch, a software-as-a-service (SaaS) application.
Developed by the CSA, the application is designed to provide organizations a centralized way to manage and maintain the integrity of the vendor review and assessment process.
Additionally, STARWatch includes access to more than 200 CSA STAR assessments, helping organizations save time with research and aiding in quicker decision making.
STARWatch delivers the content of the CSA’s de facto standards Cloud Control Matrix (CCM) and CSA’s Consensus Assessments Initiative Questionnaire v3.0.1 (CAIQ) in a database format, enabling users to manage compliance of cloud services with the CSA best practices.
STARWatch is designed to provide cloud users, providers, auditors, and security providers with assurance and compliance on demand.
During the initial beta phase, CSA achieved tremendous success, with more than 250 licenses activated… Read More
The Department of Homeland Security (DHS) showcased a number of new cybersecurity technologies at the RSA Conference from February 14th to 16th in San Francisco, CA.
Cyberthreats are growing in volume and variety.
This year, the DHS Science and Technology Directorate demonstrated 12 government-funded solutions that are ready for pilot deployment and commercialization.
S&T staff were onsite at RSA to introduce the technology, including the REnigma tool that reverse-engineers malware and ImmuneSoft for detecting and healing vulnerabilities in embedded systems.
Attendees also saw the CHARIOT tool, which filters open-source social media to remove topics that are irrelevant to cybersecurity analysts.
The other solutions ranged from behavior-tracking tools for detecting irregular network traffic to scalable technology for community-based malware defense… Read More
Data privacy experts say it’s not too early for cloud providers to start thinking about how they will comply with new data protection rules in the EU before they go into effect next year.
Businesses have just over a year before the EU’s General Data Protection Regulation (GDPR) goes into effect in May 2018.
In an interview with Talkin’ Cloud, Patrick Lastennet, Interxion’s Director of Marketing and Business Development, says that cloud providers should start evaluating their systems and processes now to ensure they protect data adequately under the new regulation.
Interxion, a European provider of colocation services, is watching the issue closely as its clients turn to the company to provide compliance guidance.
GDPR, which will be enforced beginning May 25, 2018, has a broader scope than the current Directive 95/46/EC, and will mean that more companies headquartered outside of the EU will have to comply with European data protection rules.
A study released in July found that European businesses are still fairly unprepared for the new data privacy regulation… Read More
Aiming to beef up security in connected and driverless cars, a group of US lawmakers has introduced a bipartisan bill in the House of Representatives that would put the National Highway Traffic Safety Administration (NHTSA) in charge of studying security issues for cars and trucks that are connected and eventually driverless.
According to a report, Congressmen Joe Wilson (R-SC) and Ted Lieu (D-CA) co-sponsored the bill, dubbed The Security and Privacy in Your Car Study Act.
The goal of the bill is to create a safety standard for connected cars, which are expected to be on the roads in large numbers by 2020.
The report noted the bill calls on the NHTSA to work with the Defense Department, the Federal Trade Commission, the National Institute of Standards and Technology, the Automotive Information Sharing and Analysis Center, SAE International, academics and automotive manufacturers.
Combined, the different agencies and the private sector would look at how to isolate in-car software to develop a system that can detect and prevent cyberhacks… Read More
New York State on Thursday announced final regulations requiring banks and insurers to meet minimum cybersecurity standards and report breaches to regulators as part of an effort to combat a surge in cybercrime and limit damages to consumers.
The rules, in the works since 2014, followed a series of high-profile data breaches that resulted in losses of hundreds of millions of dollars to US companies, including Target, Home Depot, and Anthem.
They lay out unprecedented requirements on steps financial firms must take to protect their networks and customer data from hackers and disclose cyber events to state regulators.
“These strong, first-in-the-nation protections will help ensure this industry has the necessary safeguards in place” to protect businesses and clients “from the serious economic harm caused by these devastating cyber-crimes,” Governor Andrew Cuomo said in a statement.
The state in December delayed implementation of the rules by two months and loosened some requirements after financial firms complained they were onerous and said they would need more time to comply… Read More
Financial services firms are embracing DevOps approaches and best practices more quickly than other industries, according to new research from managed services provider Claranet.
The study, put together in conjunction with Vanson Bourne and polling 900 end user IT leaders across European mid-market businesses, found almost half (45%) of finance companies polled had already developed a DevOps approach.
This compared favorably against other industries, such as retail, software, and media, for whom the highest figure was only one third (32%).
The report’s findings also indicated financial firms were not done with their implementations; only 12% of those in finance said they were either not planning to implement DevOps or had not made a decision, compared with 25% of the overall sample.
Michel Robert, Claranet UK Managing Director, said other industries should look to finance’s lead.
“Fintech startups are using technology to shake things up in the financial services industry with a customer-centric approach,” said Robert… Read More
Telecommunication experts estimate that the amount of data stored “in the cloud,” or in remote data centers around the world, will quintuple in the next five years.
Whether it’s streaming video or businesses’ database content drawn from distant servers, all of this data is – and will continue in the foreseeable future to be – accessed and transmitted by lasers sending pulses of light along long bundles of flexible optical fibers.
Traditionally, the rate at which information is transmitted has not taken into account the distance the data must travel, even though shorter distances can support higher rates.
Yet as the traffic grows in volume and uses increasingly more of the available bandwidth, or capacity to transfer bits of data, researchers have become increasingly aware of some of the limitations of this mode of transmission.
New research from Nokia Bell Labs in Murray Hill, NJ may offer a way to capitalize on this notion and offer improved data transfer rates for cloud computing based traffic.
The results will be presented at the Optical Fiber Communications Conference and Exhibition (OFC), March 19-23 in Los Angeles, CA… Read More
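The distance-rate tradeoff mentioned above can be made concrete with a toy model: on an amplified long-haul link, amplifier noise accumulates span by span, so shorter links enjoy a higher signal-to-noise ratio and thus a higher Shannon capacity. All numbers below are assumptions for illustration, not figures from the Bell Labs work:

```python
import math

# Toy model (my illustration, not from the article): on an amplified
# long-haul fiber link, amplifier noise accumulates with each span, so
# the achievable SNR -- and hence the Shannon capacity -- falls as the
# link gets longer. All parameter values here are assumed.

SPAN_KM = 80.0          # assumed distance between optical amplifiers
SNR_ONE_SPAN = 100.0    # assumed linear SNR (20 dB) after a single span
BANDWIDTH_GHZ = 50.0    # assumed channel bandwidth

def capacity_gbps(distance_km: float) -> float:
    """Shannon capacity of one channel over `distance_km` of fiber."""
    spans = max(1, math.ceil(distance_km / SPAN_KM))
    snr = SNR_ONE_SPAN / spans          # noise grows with the span count
    return BANDWIDTH_GHZ * math.log2(1 + snr)

for km in (100, 1000, 5000):
    print(f"{km:5d} km -> {capacity_gbps(km):6.1f} Gb/s")
```

A metro link a few spans long can therefore carry several times the rate of a transoceanic one in this model, which is the headroom the Bell Labs researchers aim to exploit.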
IBM looks best placed to pick up Verizon’s enterprise cloud services business, which the major US operator has decided is now surplus to requirements.
The acquisition of Verizon’s enterprise cloud services business would make sense for IBM for a number of reasons: cloud is one of the IT giant’s earnings bright spots, and it has a strategic imperative to grow that business — revenues from its cloud services and related activities totaled $13.7 billion for the full year 2016, up an impressive 35%.
Further scale could only help as IBM seeks to gain market share and edge closer to market leader Amazon Web Services (AWS).
IBM previously splashed out about $2 billion in 2013 to acquire SoftLayer for its cloud infrastructure business, and today the company highlights growth in cloud services as a counterpoint to financial declines in its more traditional business units.
The team at IBM will have a very good understanding of what they’d be getting: the former CTO of Verizon’s cloud business, John Considine… Read More
Google today issued a big challenge to its rivals in cloud computing by opening up access to what has been described as the world’s largest database.
The company is launching Cloud Spanner Beta, providing software developers with a database service available through Google Cloud that the search giant already uses to run its massive AdWords advertising system and Google Play app and media store.
Google’s Spanner is a relational database, the kind that stores data in familiar related rows and columns.
What sets it apart from many rivals, including IBM’s DB2, the Oracle Database, Microsoft’s SQL Server and the open-source MySQL, is that it can scale up globally across hundreds of data centers and millions of machines yet act as a single database to keep data consistent in near-real-time.
Essentially, Spanner can handle a ridiculous number of transactions around the world at once, keeping them in order without having to replicate the data in a lot of data centers.
That can be costly and can cause delays in recording the data and providing access to high-velocity applications such as stock transactions… Read More
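The consistency property described above can be sketched with a toy simulation of Spanner’s published "commit wait" idea: a transaction takes its commit timestamp at the upper bound of the clock’s uncertainty interval, then waits out that uncertainty before the write becomes visible, so commit order always matches timestamp order worldwide. This is an illustration of the TrueTime concept, not the Cloud Spanner client API; all names and the uncertainty bound are assumed:

```python
import time
from dataclasses import dataclass, field

# Purely illustrative sketch of Spanner's "commit wait" (TrueTime) idea:
# pick a commit timestamp at the upper bound of clock uncertainty, then
# wait until that timestamp is definitely in the past before making the
# write visible. The uncertainty value and all names are assumptions.

CLOCK_UNCERTAINTY_S = 0.002   # assumed +/- bound on clock error

def now_interval():
    """TrueTime-style interval [earliest, latest] around the true now."""
    t = time.time()
    return t - CLOCK_UNCERTAINTY_S, t + CLOCK_UNCERTAINTY_S

@dataclass
class Store:
    data: dict = field(default_factory=dict)
    log: list = field(default_factory=list)   # (commit_ts, key, value)

    def commit(self, key, value):
        _, latest = now_interval()
        commit_ts = latest                    # upper bound of uncertainty
        # Commit wait: block until commit_ts is certainly in the past.
        while now_interval()[0] < commit_ts:
            time.sleep(0.0005)
        self.data[key] = value
        self.log.append((commit_ts, key, value))
        return commit_ts

store = Store()
t1 = store.commit("balance", 100)
t2 = store.commit("balance", 80)
assert t2 > t1   # a later commit always carries a later timestamp
```

Because every commit waits out the clock uncertainty, any subsequent reader anywhere observes the writes in the same real-time order, which is the external consistency the article attributes to Spanner.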
US online giant Amazon on Tuesday announced the launch of a “unified communications service” which offers video and audio conferencing through its cloud computing service.
The new service, called Amazon Chime, from Amazon Web Services – which provides the online computing power for thousands of businesses – enables customers to have conversations and videoconferences whether they are using desktop computers, Apple iPhones, or Android devices.
“Most meeting applications or services are hard to use, deliver bad audio and video, require constant switching between multiple tools to do everything they want, and are way too expensive,” said Amazon Vice President Gene Farrell.
“Amazon Chime delivers frustration-free meetings, allowing users to be productive from anywhere.”
“And with no ongoing maintenance or management fees, Amazon Chime is a great choice for companies that are looking for a solution to meetings that their employees will love to use.”
The cloud computing unit of Amazon is among the fastest growing segments for the US tech giant… Read More
The OpenFog Consortium announces the release of the OpenFog Reference Architecture, a universal technical framework designed to enable the data-intensive requirements of the Internet of Things (IoT), 5G and artificial intelligence (AI) applications.
The reference architecture marks a significant first step toward creating the standards necessary to enable high performance, interoperability, and security in complex digital transactions.
Fog computing is the system-level architecture that brings computing, storage, control, and networking functions closer to the data-producing sources along the cloud-to-thing continuum.
Applicable across industry sectors, fog computing effectively addresses issues related to security, cognition, agility, latency and efficiency.
The OpenFog Consortium was founded over one year ago to accelerate adoption of fog computing through an open, interoperable architecture.
Just as TCP/IP became the standard and universal framework that enabled the Internet to take off… Read More
The W3C, led by Sir Tim Berners-Lee, seems ready to standardize DRM-enabling Encrypted Media Extensions (EME) in browsers, a move that betrays the founding principles of the open Web.
When Berners-Lee invented the Web, he gave it away.
His employer at the time, CERN, licensed the patents royalty-free for anyone to use.
An open architecture that supported the free flow of information for all made it what it is today.
But that openness is under assault, and Berners-Lee’s support for standardizing EME, a browser API that enables DRM (digital rights management) for media playback, has provoked a raging battle within the W3C (World Wide Web Consortium), the organization that sets the standards for how browsers work.
The stakes could not be higher, to hear both sides tell it.
On the one hand, Hollywood is terrified of online piracy, and studios insist that video streaming providers like Netflix use DRM to stop users from pirating movies… Read More
Industry of Things World USA — February 20th-21st in San Diego, CA. Global leaders will gather to focus on monetization of the Internet of Things (IoT) in an industrial setting.
fintech:CODE — March 16th-17th in London, UK. A new international knowledge exchange platform bringing together all DevOps, IT, and IoT stakeholders who play an active role in the finance and tech scene. Key topics include software development, technical challenges for DevOps, DevOps security, cloud technologies and SaaS.
retail:CODE — March 16th-17th in London, UK. With 20 real-life case studies, state-of-the-art keynotes, and interactive World Café sessions, 35+ influential speakers will share their knowledge at the intersection of the retail and technology sectors.
Delivery of Things World — April 24th and 25th in Berlin, Germany. Over 400 IT executives will discuss what DevOps really means for business. This event brings together all stakeholders to share their experience and expertise.
Security of Things World — June 12th and 13th in Berlin, Germany. A world class event focused on the next information security revolution. Security concerns that preoccupy enterprise customers today and pragmatic solutions to threats.
Autonomous Systems World — June 14th and 15th in Berlin, Germany. An international knowledge exchange among top experts in the field, providing a unique glimpse into the fascinating world of autonomous robots, intelligent machines, and smart technologies.
INTRASECT — June 29th and 30th in Washington, DC. The first conference of its kind to engage key stakeholders in a comprehensive and engaging examination of existing and future regulatory policy governing the usage of commercial autonomous vehicles.