In This Issue
- Pass Email Privacy Act
- Bolster Cybersecurity
- Next Platform Security
- Report from the CEO
- New CSA Security App
- DHS & Cyber Defenses
- GDPR’s Cloud Biz Opp
- Secure Connected-Car
- NY State Cyber Rules
- Finance Ind & DevOps
- Optimizing Net Design
- IBM Is Fit for VZ Cloud
- Spanner from Google
- AWS Videoconference
- Fog Ref Architecture
- Battle for Web Future
- Coming DCIA Events
With Congress consumed by political battles, there’s still a glimmer of hope for bipartisanship on privacy issues: the House last week passed the Email Privacy Act by unanimous voice vote.
That’s not altogether surprising.
Introduced by Congressmen Kevin Yoder (R-KS) and Jared Polis (D-CO), the measure was even known as the “most popular bill in Congress” for garnering 315 co-sponsors last year.
Despite its popularity, however, this important bill, which would require government agencies to get a search warrant before accessing emails or other digital communications that are more than 180 days old, failed to pass the Senate last year.
We can’t let that happen again. The Senate must step up to pass this desperately needed legislation that restores the privacy rights of all Americans in the Digital Age.
At a time when internet users are concerned about their privacy, the Email Privacy Act goes a long way toward restoring their confidence and establishing desperately needed online protections… Read More
Almost 20 years ago, Chris Wysopal was among a group of hackers who testified before the US Congress, warning it about the dangers of the internet.
Unfortunately, the US government is still struggling to act, he said.
“You’re just going to keep ending up with the status quo,” he said, pointing to the US government’s failure to regulate the tech industry or incentivize any change.
It’s a feeling that was shared by the experts who attended this week’s RSA cybersecurity show.
Clearly, the US government needs to do more on cybersecurity, but what?
Perhaps the need for US action has never been more urgent.
In last year’s election, Russia was accused of hacking US political groups and figures in an effort to influence the outcome.
In addition, major internet companies, including Yahoo, have reported huge data breaches, exposing the details of a billion user accounts… Read More
As companies face increasing security challenges, they will need to move to a new platform-based security model, top executives from Cisco and Palo Alto Networks said at this week’s 2017 RSA Conference in San Francisco, CA.
This isn’t the same platform security model that is being touted by most major security vendors today – it’s the next evolution of that model, where the focus is on visibility, analysis and enforcement, Palo Alto Networks CEO Mark McLaughlin said in a keynote address Wednesday.
“We are going to see the current security model get turned on its head. I mean that from a business model perspective,” McLaughlin said.
McLaughlin said the current state of security is too complex, too expensive, too slow to adapt to changes and has become increasingly difficult for security professionals to show a return on investment.
He said companies need to adopt a “platform of the future,” which leverages integration and automation, and encourages cross-vendor sharing of threat intelligence.
That “Platform 2.0” will ultimately lead to more innovation, sharing, automation, and software capabilities, and easier deployment… Read More
DevTeamSpace this week blogged about the top ten benefits of cloud computing in 2017.
As proponents and early adopters of storing data, accessing programs, and facilitating communications through the internet, DCIA members and DCINFO readers may appreciate the following summary.
A first benefit of cloud computing is the flexibility it offers to growing new businesses – from start-ups to line extensions.
Cloud capacity can swiftly be increased to provide more bandwidth from remote servers without massive capital investments.
In second place are automatic software updates, eliminating the need to perform such tasks as updating drivers on company computers.
Cloud computing services roll-out regular updates – including security – enabling internal IT resources to focus on higher priority business tasks.
Disaster recovery comes in third because cloud solutions are not only more affordable, but also superior to what most individual enterprise IT departments can deliver.
Cloud providers’ relevant expertise and knowledge tend to be more current, and cloud-based backup systems are enormously effective for businesses of any size.
Fourth, internal collaboration: work teams can access programs and share files from any internet connection at any time, increasing visibility and improving productivity.
Cost avoidance ranks fifth thanks to cloud computing running on a subscription-based model.
Sixth is the ability to work from anywhere with an internet connection, supporting a better work-life balance through telecommuting.
Business security is the seventh benefit: data stored in the cloud remains accessible regardless of what happens to any particular machine, and can be wiped immediately in the event of a breach.
Eighth is competition, with the cloud enabling smaller businesses to act faster than large companies, even without major financing.
Environmental impacts come in ninth by more closely aligning computing resource usage, energy and carbon footprint with actual need.
And tenth is control of information flow, which increases in both importance and degree of difficulty with the number of participants.
Cloud computing permits instant capturing and disseminating of edits throughout the project completion process. Share wisely, and take care.
The Cloud Security Alliance (CSA), the world’s leading organization dedicated to defining and raising awareness of best practices to help ensure a secure cloud computing environment, today announced the launch of STARWatch, a software-as-a-service (SaaS) application.
Developed by the CSA, the application is designed to provide organizations a centralized way to manage and maintain the integrity of the vendor review and assessment process.
Additionally, STARWatch includes access to more than 200 CSA STAR assessments, helping organizations save time with research and aiding in quicker decision making.
STARWatch delivers the content of the CSA’s de facto standards, the Cloud Controls Matrix (CCM) and the Consensus Assessments Initiative Questionnaire v3.0.1 (CAIQ), in a database format, enabling users to manage compliance of cloud services with CSA best practices.
STARWatch is designed to provide cloud users, providers, auditors, and security providers with assurance and compliance on demand.
During the initial beta phase, CSA achieved tremendous success, with more than 250 licenses activated… Read More
The Department of Homeland Security (DHS) showcased a number of new cybersecurity technologies at the RSA Conference from February 14th to 16th in San Francisco, CA.
Cyberthreats are growing in volume and variety.
This year, the DHS Science and Technology Directorate demonstrated 12 government-funded solutions that are ready for pilot deployment and commercialization.
S&T staff were onsite at RSA to introduce the technology, including the REnigma tool that reverse-engineers malware and ImmuneSoft for detecting and healing vulnerabilities in embedded systems.
Attendees also saw the CHARIOT tool, which filters open-source social media to remove topics that are irrelevant to cybersecurity analysts.
The other solutions ranged from behavior-tracking tools for detecting irregular network traffic to scalable technology for community-based malware defense… Read More
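A tool like CHARIOT presumably scores social media posts by their relevance to security topics before surfacing them to analysts. As a toy illustration of that filtering idea – a hypothetical sketch, not the DHS implementation – a minimal keyword-based relevance filter might look like:

```python
# Toy relevance filter in the spirit of CHARIOT (hypothetical sketch,
# not the DHS tool): score posts by distinct security-keyword hits
# and keep only those at or above a threshold.

SECURITY_TERMS = {"malware", "ransomware", "breach", "phishing",
                  "exploit", "botnet", "vulnerability", "ddos"}

def relevance_score(post: str) -> int:
    """Count distinct security keywords appearing in the post."""
    words = set(post.lower().split())
    return len(words & SECURITY_TERMS)

def filter_feed(posts, threshold=1):
    """Keep posts with at least `threshold` security-keyword hits."""
    return [p for p in posts if relevance_score(p) >= threshold]

feed = [
    "New ransomware strain spreads via phishing emails",
    "Great pizza downtown today",
    "Botnet hits router vulnerability in DDoS attacks",
]
print(filter_feed(feed))  # the off-topic post is dropped
```

A production system would of course use trained classifiers rather than a fixed keyword list, but the pipeline shape – score, threshold, forward to analysts – is the same.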
Businesses have over a year before the EU’s General Data Protection Regulation (GDPR) goes into effect in May 2018, but data privacy experts say it’s not too early for cloud providers to start thinking about how they will comply.
In an interview with Talkin’ Cloud, Patrick Lastennet, Interxion’s Director of Marketing and Business Development, says that cloud providers should start evaluating their systems and processes now to ensure they protect data adequately under the new regulation.
Interxion, a European provider of colocation services, is watching the issue closely as its clients turn to the company to provide compliance guidance.
GDPR, which will be enforced beginning May 25, 2018, has a broader scope than the current Directive 95/46/EC, and will mean that more companies headquartered outside the EU will have to comply with European data protection rules.
A study released in July found that European businesses are still fairly unprepared for the new data privacy regulation… Read More
Aiming to beef up security in connected and driverless cars, a bipartisan group of US lawmakers has introduced a bill in the House of Representatives that would put the National Highway Traffic Safety Administration (NHTSA) in charge of studying security issues for connected and eventually driverless cars and trucks.
According to a report, Congressmen Joe Wilson (R-SC) and Ted Lieu (D-CA) co-sponsored the bill, dubbed The Security and Privacy in Your Car Study Act.
The goal of the bill is to create a safety standard for connected cars, which are expected to be on the roads in large numbers by 2020.
The report noted the bill calls on the NHTSA to work with the Defense Department, the Federal Trade Commission, the National Institute of Standards and Technology, the Automotive Information Sharing and Analysis Center, SAE International, academics and automotive manufacturers.
Combined, the different agencies and the private sector would look at how to isolate in-car software to develop a system that can detect and prevent cyberhacks… Read More
New York State on Thursday announced final regulations requiring banks and insurers to meet minimum cybersecurity standards and report breaches to regulators as part of an effort to combat a surge in cybercrime and limit damages to consumers.
The rules, in the works since 2014, followed a series of high-profile data breaches that resulted in losses of hundreds of millions of dollars to US companies, including Target, Home Depot, and Anthem.
They lay out unprecedented requirements on steps financial firms must take to protect their networks and customer data from hackers and disclose cyber events to state regulators.
“These strong, first-in-the-nation protections will help ensure this industry has the necessary safeguards in place” to protect businesses and clients “from the serious economic harm caused by these devastating cyber-crimes,” Governor Andrew Cuomo said in a statement.
The state in December delayed implementation of the rules by two months and loosened some requirements after financial firms complained they were onerous and said they would need more time to comply… Read More
Financial services firms are embracing DevOps approaches and best practices more quickly than other industries, according to new research from managed services provider Claranet.
The study, conducted in conjunction with Vanson Bourne and polling 900 IT leaders across European mid-market businesses, found that almost half (45%) of the finance companies polled had already developed a DevOps approach.
This compared favorably against other industries, such as retail, software, and media, for whom the highest figure was only about one third (32%).
The report’s findings also indicated financial firms were not done with their implementations; only 12% of those in finance said they were either not planning to implement DevOps or had not made a decision, compared with 25% of the overall sample.
Michel Robert, Claranet UK Managing Director, said other industries should look to finance’s lead.
“Fintech startups are using technology to shake things up in the financial services industry with a customer-centric approach,” said Robert… Read More
Telecommunications experts estimate that the amount of data stored “in the cloud” or in remote data centers around the world will quintuple in the next five years.
Whether it’s streaming video or business database content drawn from distant servers, all of this data is – and for the foreseeable future will continue to be – accessed and transmitted by lasers sending pulses of light along long bundles of flexible optical fiber.
Traditionally, the rate at which information is transmitted does not take into account the distance the data must travel, even though shorter distances can support higher rates.
Yet as the traffic grows in volume and uses increasingly more of the available bandwidth, or capacity to transfer bits of data, researchers have become increasingly aware of some of the limitations of this mode of transmission.
New research from Nokia Bell Labs in Murray Hill, NJ may offer a way to capitalize on this notion and improve data transfer rates for cloud computing traffic.
The results will be presented at the Optical Fiber Communications Conference and Exhibition (OFC), March 19-23 in Los Angeles, CA… Read More
IBM looks best placed to pick up Verizon’s enterprise cloud services business, which the major US operator has decided is now surplus to requirements.
Light Reading reported earlier this month that Verizon has agreed to a deal to sell that business, and the acquisition would make sense for IBM for a number of reasons. Cloud is one of the IT giant’s earnings bright spots, and it has a strategic imperative to grow that business: revenues from its cloud services and related activities totaled $13.7 billion for the full year 2016, up an impressive 35%.
Further scale could only help as IBM seeks to gain market share and edge closer to market leader Amazon Web Services (AWS).
IBM previously splashed out about $2 billion in 2013 to acquire SoftLayer for its cloud infrastructure business, and today the company highlights growth in cloud services as a counterpoint to financial declines in its more traditional business units.
The team at IBM will have a very good understanding of what they’d be getting: the former CTO of Verizon’s cloud business, John Considine… Read More
Google today issued a big challenge to its rivals in cloud computing by opening up access to what has been described as the world’s largest database.
The company is launching Cloud Spanner Beta, providing software developers with a database service available through Google Cloud that the search giant already uses to run its massive AdWords advertising system and Google Play app and media store.
Google’s Spanner is a relational database, the kind that stores data in familiar related rows and columns.
What sets it apart from many rivals, including IBM’s DB2, the Oracle Database, Microsoft’s SQL Server and the open-source MySQL, is that it can scale up globally across hundreds of data centers and millions of machines yet act as a single database to keep data consistent in near-real-time.
Essentially, Spanner can handle an enormous number of transactions around the world at once, keeping them in order without having to replicate the data across a lot of data centers – an approach that can be costly and can delay both recording the data and providing access to it for high-velocity applications such as stock transactions… Read More
US online giant Amazon on Tuesday announced the launch of a “unified communications service” which offers video and audio conferencing through its cloud computing service.
The new service, Amazon Chime, comes from Amazon Web Services – which provides the online computing power for thousands of businesses – and enables customers to hold conversations and videoconferences whether they are using desktop computers, Apple iPhones, or Android devices.
“Most meeting applications or services are hard to use, deliver bad audio and video, require constant switching between multiple tools to do everything they want, and are way too expensive,” said Amazon Vice President Gene Farrell.
“Amazon Chime delivers frustration-free meetings, allowing users to be productive from anywhere.”
“And with no ongoing maintenance or management fees, Amazon Chime is a great choice for companies that are looking for a solution to meetings that their employees will love to use.”
The cloud computing unit of Amazon is among the fastest growing segments for the US tech giant… Read More
The OpenFog Consortium announces the release of the OpenFog Reference Architecture, a universal technical framework designed to enable the data-intensive requirements of the Internet of Things (IoT), 5G and artificial intelligence (AI) applications.
The reference architecture marks a significant first step toward creating the standards necessary to enable high performance, interoperability, and security in complex digital transactions.
Fog computing is the system-level architecture that brings computing, storage, control, and networking functions closer to the data-producing sources along the cloud-to-thing continuum.
Applicable across industry sectors, fog computing effectively addresses issues related to security, cognition, agility, latency and efficiency.
The OpenFog Consortium was founded over one year ago to accelerate adoption of fog computing through an open, interoperable architecture.
Just as TCP/IP became the standard and universal framework that enabled the Internet to take off… Read More
The W3C, led by Sir Tim Berners-Lee, seems ready to standardize DRM-enabling Encrypted Media Extensions (EME) in browsers, a move that betrays the founding principles of the open Web.
When Berners-Lee invented the Web, he gave it away.
His employer at the time, CERN, licensed the patents royalty-free for anyone to use.
An open architecture that supported the free flow of information for all made it what it is today.
But that openness is under assault, and Berners-Lee’s support for standardizing EME, a browser API that enables DRM (digital rights management) for media playback, has provoked a raging battle within the W3C (World Wide Web Consortium), the organization that sets the standards for how browsers work.
The stakes could not be higher, to hear both sides tell it.
On the one hand, Hollywood is terrified of online piracy, and studios insist that video streaming providers like Netflix use DRM to stop users from pirating movies… Read More
fintech:CODE — March 16th-17th in London, UK. A new international knowledge exchange platform bringing together all DevOps, IT, and IoT stakeholders who play an active role in the finance and tech scene. Key topics include software development, technical challenges for DevOps, DevOps security, cloud technologies and SaaS.
retail:CODE — March 16th-17th in London, UK. With 20 real-life case studies, state-of-the-art keynotes, and interactive World Café sessions, 35+ influential speakers will share their knowledge at the intersection of the retail and technology sectors.
Delivery of Things World — April 24th and 25th in Berlin, Germany. Over 400 IT executives will discuss what DevOps really means for business. This event brings together all stakeholders to share their experience and expertise.
Security of Things World — June 12th and 13th in Berlin, Germany. A world-class event focused on the next information security revolution, addressing the security concerns that preoccupy enterprise customers today and pragmatic solutions to those threats.
Autonomous Systems World — June 14th and 15th in Berlin, Germany. An international knowledge exchange among top experts in the field, providing a unique glimpse into the fascinating world of autonomous robots, intelligent machines, and smart technologies.
INTRASECT — June 29th and 30th in Washington, DC. The first conference of its kind to engage key stakeholders in a comprehensive and engaging examination of existing and future regulatory policy governing the usage of commercial autonomous vehicles.