In This Issue
- Trump/Net Neutrality
- McCain: Cyber Comm
- Cognitive Hack Book
- Report from the CEO
- CS & Online Shopping
- Analytics and Storage
- Mobile Edge Compute
- Services as Software
- Flash Storage at HPC
- Behold the Intercloud
- What’s Best for Apps?
- Four Stages of IIoT
- Cloud Price Discounts
- Oracle Acquires Dyn
- G&B Decentralized IT
- Help Mobile Workforce
- Coming DCIA Events
President-elect Trump has made it clear that he is hostile towards the FCC’s implementation of net neutrality, both in his own words and, today, with the appointment of two long-time adversaries of the policy to his transition team.
Jeffrey Eisenach is an economist and government veteran who worked at the FTC in the ’80s; he’s worked for a number of think tanks and research institutes, and has been a vocal opponent of the FCC’s current net neutrality rules.
The NY Times did some excellent reporting on the man back in August, if you’re curious about the possibility of conflict of interest.
Eisenach described net neutrality as “an effort by one set of private interests to enrich itself by using the power of the state to obtain free services from another” in his testimony before the Senate Judiciary Committee in 2014.
He suggested that ISPs have no reason to discriminate between services, and that they engender innovation rather than stifle it.
Mark Jamison worked on Sprint’s lobbying team in the ’90s, and like Eisenach has done expert consulting work for several organizations. In a recent op-ed, he called net neutrality a “growing miscellany of ex ante regulations… Read More
Senate Armed Services Chairman John McCain (R-AZ) said Saturday that he does not believe Russian hackers affected the presidential election, adding that he plans to recommend a new select committee on cybersecurity.
“I do not think that the outcome of the election was impacted by Russian hacking. I think the American people have the good sense to vote for people without the help of Russian hackers,” McCain said at the Halifax International Security Forum.
“But I do believe it’s important for us to have — and I will be recommending — a select committee on the whole issue of cybersecurity.”
McCain went on to say that cybersecurity “crossed jurisdictional lines” between committees like Armed Services and Intelligence, and the issue is too important to leave without a specific home.
McCain noted that cyber is the only aspect in which the US military is not leaps and bounds ahead of other nations.
“When Admiral Mike Rogers, the head of Cyber Command, testifies before our committee and says, ‘I don’t know what I don’t know,’ that, my friends, is one of the more disturbing things you could ever hear… Read More
Christopher Skroupa: What is the thesis of your book Cognitive Hack: The New Battleground in Cybersecurity – The Human Mind, and how does it fit in with recent events in cybersecurity?
James Bone: Cognitive Hack follows two rising narrative arcs in cyber warfare: the rise of the “hacker” as an industry and the “cyber paradox,” namely why billions spent on cybersecurity fail to make us safe.
The backstory of the two narratives reveals a number of contradictions about cybersecurity, as well as how surprisingly simple it is for hackers to bypass defenses.
The cyber battleground has shifted from an attack on hard assets to a much softer target: the human mind.
If human behavior is the new and last “weakest link” in the cybersecurity armor, is it possible to build cognitive defenses at the intersection of human-machine interactions?
The answer is yes, but the change that is needed requires a new way of thinking about security, data governance and strategy. The two arcs meet at the crossroads of data intelligence… Read More
In proposing his national cybersecurity plan in October, US President-elect Trump said, “As a deterrent against attacks on our critical resources, the United States must possess the unquestioned capacity to launch crippling cyber counterattacks.”
“This is the warfare of the future, and America’s dominance in this arena must be unquestioned,” he added.
This week, Trump’s appointment of Michael Flynn, a vocal critic of the nation’s cyber capabilities, as US National Security Advisor marks a step forward in implementing such a cybersecurity plan, one comprising offensive as well as defensive components.
With a strong background in military intelligence, including operational experience in Afghanistan and Iraq, Flynn previously served as Director of the Defense Intelligence Agency and, after retiring with the rank of Lieutenant General, founded Flynn Intel Group, a cyber-threat prevention consultancy.
Speaking last year on geopolitical hot zones, he said, “We have competitors out there who are rapidly catching up with us.”
“It is stunning how often nation states such as China, Russia, or Iran, or transnational criminal organizations, attack our networks,” he added in an interview last December.
“It’s something we are frankly not prepared for.”
“We cannot win playing on one side of the playing field, on the defensive end,” Flynn said.
The US needs a technological edge over the enemy.
Flynn also discussed overhauling the way the US approaches rapidly changing cybersecurity, including with the addition of a spokesperson role to explain complex cyber matters to the public.
He also supports Trump’s call for a public-and-private-sector task force to meet frequently to discuss new legislation and policies on technology.
“There has to be some means to speed up how the government functions in this world,” he said. Share wisely, and take care.
With many retailers offering internet-only promotions to go along with their in-store doorbusters, more Americans than ever seem to be choosing to stay home to take advantage of the best deals of the season.
Research from Visa projects an 18 percent increase in online holiday spending this year, which follows 16 percent growth over the 2015 season from the year before.
That uptick in 2015 resulted in about $11 billion of online sales over the five-day Thanksgiving weekend period (Thanksgiving Day through Cyber Monday).
That’s why it’s essential that shoppers protect themselves and their personal information more than ever in 2016.
Especially since “25 percent of all security breaches are taking place in the retail sector,” said Experts Exchange COO Gene Richardson.
As a former head of the data security teams at IBM, Charles Schwab, and Motorola, Richardson has extensive experience advising companies and consumers alike on how to avoid fraud and protect their identities online… Read More
US-based organizations are budgeting $1.77M for cloud spending in 2017 compared to $1.30M for non-US-based organizations.
10% of enterprises with over 1,000 employees project they will spend $10M or more on cloud computing apps and platforms this year.
Organizations are using multiple cloud models to meet their business’s needs, including private (62%), public (60%), and hybrid (26%).
By 2018, the typical IT department will have a minority of its apps and platforms (40%) residing in on-premises systems.
These and many other insights are from IDG’s Enterprise Cloud Computing Survey, 2016.
The study’s methodology is based on interviews with respondents who report being involved with cloud planning and management across their organizations… Read More
Gaining traction from a variety of trends, including network virtualization and the internet of things (IoT), mobile edge computing provides traditional data center compute and storage functions at the edge of the network.
As a result, it can reduce application response times for lower latency, increase reliability and improve wireless network security.
Mobile edge computing is based on the principle that moving compute and storage capabilities to the edge of the network improves application performance for end users and devices.
It provides an alternative to the centralized data center model, which sends data to and from centralized data center facilities. However, mobile edge computing is still in the early stages of development and predeployment.
The goal of mobile edge computing is to process data closer to the original application source, thereby speeding response times to the end user.
It sends data to the data center only when necessary, reducing the amount of data traffic on the network and reducing latency… Read More
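The forward-only-when-necessary pattern described above can be sketched in a few lines; the function name, threshold, and payload shape below are illustrative assumptions, not any particular edge platform’s API:

```python
# Minimal sketch of edge-side processing: aggregate sensor readings
# locally and upload only a summary plus out-of-range events, instead
# of streaming every raw reading to a central data center.
# All names and thresholds here are illustrative assumptions.
from statistics import mean

ANOMALY_THRESHOLD = 90.0  # hypothetical limit for a sensor value

def process_at_edge(readings):
    """Return the (much smaller) payload an edge node would upload."""
    anomalies = [r for r in readings if r > ANOMALY_THRESHOLD]
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "anomalies": anomalies,  # raw values kept only for anomalies
    }

readings = [71.2, 70.8, 95.5, 69.9, 71.0]
payload = process_at_edge(readings)
print(payload)  # one small dict instead of five raw readings
```

The network sees one compact summary per batch rather than every sample, which is the latency and traffic reduction the mobile edge model is after.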
In just a few decades, the world of data-driven decisions has gone through a significant transformation.
The underlying driver for this is accelerating change in the business environment.
Business models are changing, new competitors disrupt existing businesses and the Fortune 500 list changes every year as former leaders bite the dust.
We see an evolution of four distinct stages of how large businesses have approached problem solving using data.
Before technology, there were people – lawyers, accountants, designers, and other specialists who could help businesses understand their operations better and make decisions to create new products and services, restructure through lean times, and power and sustain growth.
During the 90s, computing went through a step change from centralized mainframes to distributed computing.
We saw new solutions for decision support, ranging from specific software addressing specific problems – pricing, CRM, and so on – to Swiss Army knife-like tools such as spreadsheets and statistical packages… Read More
Flash is present in just over 90 percent of the HPC data centers belonging to the 143 IT professionals whom DataDirect Networks (DDN) quizzed for the latest edition of its High Performance Computing Trends survey.
The big data storage solutions provider also found that 80 percent were using hybrid flash arrays (flash and hard disk storage) for caching operations or to accelerate metadata or data sets.
Despite the blazing performance provided by all-flash arrays, only 10 percent said they employ this type of storage system.
Cloud adoption is rising among national research laboratories, oil producers, life sciences firms and other organizations that place big demands on their IT environments.
Thirty-seven percent said they plan to use the cloud to fulfill at least some of their data storage requirements in 2017, a 10-point jump compared to last year.
Those cloud adopters are also largely avoiding public clouds for their HPC storage needs, opting instead for private and hybrid cloud solutions (80 percent), said Laura Shepard, Senior Director of Marketing at DDN… Read More
When I first wrote about cloud computing way back in 2008 [see IEEE Spectrum, August 2008], there was a gee-whiz aura surrounding this relatively new way of storing our data and provisioning computing resources.
Now, more than eight years later, cloud computing is just another humdrum piece of technology.
For proof, you need look no further than the latest version of Gartner’s famous Hype Cycle for Emerging Technologies, which no longer includes an entry for “Cloud Computing,” a sure sign of mainstream acceptance.
But network engineers have not been idle—they’ve been busy inventing new subsets of cloud computing.
These new subtypes must have new names, of course, and so the lexicon of cloud computing has changed quite a bit over the last eight years.
There’s a growing field of cloud-related inventions with names that play on their “cloud” origins.
For example, fog computing refers to data storage, applications, processing, and other computing services delivered from nearby devices rather than from a remote data center (after all, fog is what we call clouds that come close enough to touch)… Read More
The proliferation of intelligence into things and devices is rapidly transforming the way enterprises collect and derive value from data.
Complementing this is ubiquitous connectivity, which is producing more network-enabled things than ever before, connecting machines, systems, people, and the environment with control systems and IT on a platform known as the Internet of Things (IoT).
These IoT devices and “things” generate petabytes of information every day and challenge enterprises to rethink their methods of managing, storing, and extracting insight from the Big Data of the IoT or Big Analog Data.
Let’s look at the increasing number of connected devices.
According to Gartner, 6.4 billion IoT devices will be in use in 2016, a 30% increase from the previous year. And with 5.5 million new things connecting each day, the IoT universe is expected to reach 20.8 billion devices by 2020.
Massive volumes of data produced by the IoT contain real-time insight. I like to portion this insight into three categories… Read More
As I see every day, we’re surrounded by more and more intelligent devices and “things” – from cars and home monitoring systems to assembly lines, computers, and smartphones.
These are all evolving into the Internet of Things (IoT), which has recently achieved celebrity status on par with earlier dot-coms, Big Data, and the cloud.
Comprising troves of valuable datasets, the IoT is challenging organizations across all industries to tackle compute-intensive applications and tremendous workloads in order to remain competitive.
According to a recent report by MarketsandMarkets, the demand for hardware, software, and platform solutions and services specifically designed to support IoT solutions is expected to grow from $130 billion in 2015 to $883 billion by 2022, at a compound annual growth rate (CAGR) of 32.4% between 2016 and 2022.
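As a quick sanity check on growth figures like these, CAGR is just the geometric growth rate between two values; a generic helper (not tied to the report itself) shows how the quoted numbers hang together:

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate as a fraction."""
    return (end_value / start_value) ** (1 / years) - 1

# The $130B -> $883B span over roughly seven years implies about
# 31-32% per year, consistent with the quoted 32.4% CAGR (the small
# difference comes from rounding in the report's headline figures).
growth = cagr(130, 883, 7)
print(f"{growth:.1%}")  # prints "31.5%"
```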
Leveraging the proliferation of intelligent devices is giving businesses a choice: innovate, or be left behind.
Increasing connectivity and adoption of HPC at the edge are helping enterprises shift compute functions out of the data center… Read More
Not so long ago, the discussion of whether businesses should use Amazon, Microsoft, or Google data centers instead of their own centered mostly on pricing.
The conventional wisdom was that cloud computing equaled inexpensive while using internal data centers was pricey.
It was a very simplistic – some would say inaccurate – description.
But with the big three cloud providers cutting prices of their computing power seemingly every other day, the perception was reinforced.
What was lost in the debate was that while computing and storage prices often fell, the price of other cloud services including networking and things like workflow and data analytics did not.
And, the price of moving data out of a given cloud is downright breathtaking, not to mention time-consuming, for a company.
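A back-of-the-envelope comparison makes the point; the per-hour and per-GB prices below are hypothetical placeholders, not any provider’s actual rates:

```python
# Rough cloud cost comparison. Both prices are illustrative
# assumptions, not real provider rates.
COMPUTE_PER_HOUR = 0.05   # assumed $/hour for one VM
EGRESS_PER_GB = 0.09      # assumed $/GB to move data out of the cloud

def monthly_compute_cost(vm_count, hours=730):
    """Cost of running vm_count VMs for one month (~730 hours)."""
    return vm_count * hours * COMPUTE_PER_HOUR

def egress_cost(terabytes):
    """One-time cost of moving data out of the cloud."""
    return terabytes * 1024 * EGRESS_PER_GB

compute = monthly_compute_cost(10)   # 10 VMs for a month
one_migration = egress_cost(100)     # moving 100 TB out once

print(f"compute: ${compute:,.2f}")        # prints "compute: $365.00"
print(f"egress:  ${one_migration:,.2f}")  # prints "egress:  $9,216.00"
```

Under these assumed rates, a single 100 TB migration out of a cloud costs an order of magnitude more than a month of compute for a modest fleet, which is why egress pricing, not compute pricing, often dominates the decision.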
So price is a factor – but not the only factor in the cloud computing decision, a notion that new Deutsche Bank research published Monday seems to bear out… Read More
Oracle announced today it signed an agreement to acquire Dyn, the cloud-based internet performance and DNS provider.
The news comes one month after Manchester, NH-based Dyn reported a major Distributed-Denial-of-Service (DDoS) attack that crippled major websites including Twitter, Reddit, GitHub, Amazon, Netflix, and Spotify, as well as Dyn’s own site.
Company officials did not disclose financial terms of the deal.
“Oracle already offers enterprise-class Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) for companies building and running internet applications and cloud services,” Thomas Kurian, president of product development for Redwood City, CA-based Oracle, said in a statement.
“Dyn’s immensely scalable and global DNS is a critical core component and a natural extension to our cloud computing platform.”
Dyn’s solution drives 40 billion traffic optimization decisions daily for more than 3,500 enterprise customers.
The acquisition will allow Oracle cloud customers to “optimize infrastructure costs, maximize application and website-driven revenue… Read More
It’s a discussion that has rumbled on for years: does a centralized or decentralized model of IT work best?
According to new research commissioned by VMware, cloud computing has become key to the decentralization of IT – and while it may ease the pressure in some areas, security, management, and compliance issues remain.
The latest release, titled State of the Cloud, represents the second half of VMware’s study; the first half, The IT Archipelago, was conducted by the Economist Intelligence Unit, while the new research was carried out by research firm Vanson Bourne.
According to the 3,300 respondents across 20 countries, more than two thirds (69%) said that the management of IT has continued to decentralize over the past three years, while a similar number (65%) of IT-based respondents said they want IT to be more centralized.
Three quarters (74%) say IT should be responsible for enabling other lines of business to drive innovation.
On the flip side, more than half (57%) agree that decentralization has resulted in non-secure IT solutions being purchased, while 56% said it means a lack of regulatory compliance and data protection… Read More
In November 2015, a series of coordinated terrorist attacks left the city of Paris in shock – killing 130 people and injuring more than 350.
In the last twelve months, similar attacks have occurred in Brussels, Nice, and New York, part of an increasing regularity of critical events worldwide.
As the world becomes more uncertain, the need for organizations to be able to instantly locate and alert employees of nearby risks increases.
The effect of mass globalization on business has seen a rise in the popularity of mobile working.
The latest report by Strategy Analytics indicates the global mobile workforce is expected to grow to more than 1.75 billion by 2020, accounting for almost half of the entire global workforce.
This means more employees than ever will be regularly traveling between locations, often to different cities and countries, or working remotely. As a consequence, keeping mobile workers safe from harm is rising up the corporate agenda.
Traveling employees and the wider mobile workforce face a range of risks that could impact their safety and security.
These threats are not limited to acts of terror but include fires, natural disasters, flooding, and building closures… Read More
Government Video Expo — December 6th-8th in Washington, DC. GVE is the East Coast’s largest technology event for broadcast and video professionals, featuring a full exhibit floor, numerous training options, free seminars, keynotes, panel discussions, networking opportunities, and more.
CES 2017 — January 5th-8th in Las Vegas, NV. More than 3,800 exhibiting companies showcasing innovation across 2.4 million net square feet, representing 24 product categories.
Industry of Things World USA — February 20th-21st in San Diego, CA. Global leaders will gather to focus on monetization of the Internet of Things (IoT) in an industrial setting.
fintech:CODE — March 16th-17th in London, UK. A new international knowledge exchange platform bringing together all DevOps, IT, and IoT stakeholders who play an active role in the finance and tech scene. Key topics include software development, technical challenges for DevOps, DevOps security, cloud technologies and SaaS.
retail:CODE — March 16th-17th in London, UK. With 20 real-life case studies, state-of-the-art keynotes, and interactive World Café sessions, 35+ influential speakers will share their knowledge at the intersection of the retail and technology sectors.
Delivery of Things World — April 24th and 25th in Berlin, Germany. Over 400 IT executives will discuss what DevOps really means for business. This event brings together all stakeholders to share their experience and expertise.