Time for things to change

The data theft at Yahoo that came to light yesterday evening demonstrates once again that companies should scrutinize their security strategy closely and adapt it to changing threats. A commentary by Christoph Stoica on the record hack at Yahoo.

68 million cracked user accounts at the cloud storage service Dropbox, 120,000 stolen customer credentials at Telekom, and now the record hack of half a billion user records at the internet service Yahoo, the one-time poster child of the New Economy. Not even eight weeks separated these three reports, and it is hard to shake the feeling that news of data thefts is multiplying at an inflationary rate, both in sheer frequency and, above all, in the number of compromised user accounts. For the press, such spectacular cyberhacks are a feast, and perhaps some enterprising bookmaker is already taking bets on how long it will take for the current record of 500,000,000 compromised accounts to be topped by an even bigger theft. For the companies affected, however, such attacks mean first and foremost a loss of image and credibility. In Yahoo's case, the breach also appears to have real financial consequences.

As always when such mega-breaches become public, the finger-waggers are speaking up again, asking how a data theft of this scale was possible at all and why it apparently went undetected for so long. The word negligence will quickly make the rounds again, in part certainly with some justification. Still, it is hard to imagine that the companies named above, all of them from the IT industry, acted with gross negligence in terms of the security measures in place at the time. Bear in mind that all of the recently disclosed data thefts go back to network attacks carried out four years ago, or two years ago in Yahoo's case. Back then, the prevailing doctrine of "protect and defend" was considered sufficient to safeguard sensitive data: companies invested above all in ever more sophisticated firewalls and antivirus software, and the combination of username and password was regarded as the best possible compromise between security and usability. But rapidly evolving trends such as cloud computing and cloud services, social media, the mobile internet and BYOD demand a completely different view of IT security strategy. Growing technological penetration and interconnection, the resulting complexity of IT landscapes, changing forms of business collaboration and the "always on" mentality, meaning being reachable online at any time and from any place, clearly confront IT departments with new challenges. The classic defense of IT networks and systems at their outer perimeter is steadily eroding, because there is no longer a boundary between "inside" and "outside" the network: today the network is everywhere, and so is the adversary.


Time for things to change: from static IT security to a dynamic IT security strategy

Given the constantly growing threat of cyberattacks, companies are challenged more than ever to review their security strategy and adapt it to the changed risk landscape. Unlike perhaps four years ago, the technical means to do so are already available, for example in the form of risk-based access management. A risk analysis and the implementation of risk-based access control built on multi-factor authentication should therefore be the foundation of every security concept. A further cornerstone of comprehensive IT security is a central overview of all granted permissions. The underlying concepts are defined on the basis of attributes, IT and business roles, and policies. Simplifying and automating the processes for recertifying access rights, and establishing identity governance initiatives, are part of this as well.
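To make this concrete, here is a minimal, hypothetical sketch of the idea behind risk-based access control: each access request is scored against contextual signals, and the required authentication strength rises with the score. The factors, weights and thresholds are illustrative assumptions, not the logic of any particular product.

```python
# Minimal sketch of a risk-based access decision. Each request is scored
# against contextual signals; higher scores demand stronger authentication.
# Factors, weights and thresholds are illustrative assumptions.

def risk_score(request: dict) -> int:
    score = 0
    if request["device_unknown"]:
        score += 2  # unmanaged or first-seen device
    if request["geo_unusual"]:
        score += 2  # login from an unusual location
    if request["outside_business_hours"]:
        score += 1
    if request["resource_sensitivity"] == "high":
        score += 2  # payroll data, customer records, admin consoles ...
    return score

def required_authentication(request: dict) -> str:
    score = risk_score(request)
    if score >= 5:
        return "deny"      # too risky: block the request and raise an alert
    if score >= 3:
        return "mfa"       # step up: one-time password, biometrics ...
    return "password"      # low risk: a single factor is acceptable

print(required_authentication({
    "device_unknown": True,
    "geo_unusual": True,
    "outside_business_hours": False,
    "resource_sensitivity": "high",
}))  # -> deny
```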

Conclusion:

In summary, a complete reassessment of how permissions and access are handled is required. For managing user identities and access rights, a purely IT-centric form of identity management is no longer enough. Where the goal used to be mainly to automate user administration and to comply with data protection and compliance rules, today intelligent management solutions are needed that adapt IT security to changing requirements and can react situationally and in real time. Companies that still rely on a static security strategy in the face of massively mounting data thefts and cyberattacks are acting negligently, both with regard to data security and with regard to possible damage to their company's image and value.


Christoph Stoica

Regional General Manager DACH

Micro Focus


New security standard PCI DSS 3.2 – tightening the thumbscrews on the financial industry – Part 2

With the new security requirements, the PCI Security Standards Council has sent a clear signal about how sensitive cardholder data must be protected. Companies have been granted a grace period until February 1, 2018 to implement the new requirements, but the course should be set today. Learn how an effective, strong authentication strategy helps you solve the password problem and stay compliant.

In the first part of my blog on the new PCI DSS 3.2 security standard, I wrote about the changed security requirements, which now make the consistent use of multi-factor authentication mandatory for administrators at banks, merchants and anyone else who works with credit cards. Even though companies have been granted a grace period until February 1, 2018 to implement the new requirements, the course should be set today. A large number of vendors offer different multi-factor authentication schemes, and the number of authentication methods keeps growing rapidly. HSBC in the UK, for example, recently announced that from summer 2016 it will introduce a combination of voice biometrics and fingerprint recognition to authenticate more than 15 million eBanking customers.


The end of static passwords and simple PINs… there are better solutions for a secure future

Authentication granting access to one's own bank account is then performed via smartphone, voice ID and fingerprint. Innovative hardware and software can uniquely identify a voice by more than 100 characteristics, such as speed, emphasis and rhythm, and this even works when the speaker has a cold. Another interesting approach, which Micro Focus and Nymi are currently working on, is authentication via the user's own heartbeat: the user puts on a wristband that reads the heartbeat as an ECG, recognizes the individual pattern and verifies it.

Every company has different requirements and prerequisites for implementing such MFA solutions, so there is no one-size-fits-all answer. The differences lie above all in how well a solution integrates with remote access systems and cloud applications. So how best to solve the password problem?

An effective authentication strategy

There are three core points companies should consider when planning an authentication scheme that suits them (a minimal policy sketch in code follows the list):

  • Map business policies onto modular rules – existing policies should be reusable, updatable and extendable to mobile devices. This simplifies access control for IT security, because access for a device can then be revoked quickly in the event of a security incident.
  • Improve the usability of mobile platforms. Some legacy applications have a web interface but were designed neither for mobile access nor for regular updates. Single sign-on (SSO) mechanisms for native and web applications can be genuinely helpful here.
  • Deploy a flexible mix of authentication mechanisms to strike an appropriate balance between security requirements, operational agility and user convenience. The authentication method should always be adjustable to the level of protection actually required. Different users and situations call for different authentication: the method used must fit both the user's role and the situation.
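As referenced above, the following sketch shows what such modular, role- and situation-aware policies might look like in code. The roles, channels and methods are invented for illustration; a real deployment would draw them from the identity management system.

```python
# Sketch of modular authentication policies: reusable rules map a user's
# role and access channel to an authentication method, so the same policy
# definitions cover desktop and mobile alike. All names are illustrative.

POLICIES = [
    # (role,      channel,   method)
    ("admin",     "any",     "smartcard+pin"),
    ("finance",   "mobile",  "fingerprint+otp"),
    ("finance",   "desktop", "password+otp"),
    ("employee",  "mobile",  "fingerprint"),
    ("employee",  "desktop", "password"),
]

def select_method(role: str, channel: str) -> str:
    for p_role, p_channel, method in POLICIES:
        if p_role == role and p_channel in (channel, "any"):
            return method
    return "deny"  # no matching policy: fail closed

def revoke_device(sessions: dict, device_id: str) -> None:
    """After a security incident, drop the device from every user's sessions."""
    for devices in sessions.values():
        devices.discard(device_id)

print(select_method("finance", "mobile"))      # -> fingerprint+otp
print(select_method("contractor", "desktop"))  # -> deny
```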

When planning a suitable multi-factor authentication scheme, however, companies should not orient themselves solely on the status quo of their requirements; they should also look at future needs. Points to consider include central administration and control of users and endpoints, total cost of ownership (TCO), and whether new requirements such as cloud services and mobile devices can be secured with the same MFA product without additional add-on modules.

Thomas Hofmann

Systems Engineer – Micro Focus Switzerland



Oh, how beautiful Panama is … if only there were no data leaks.

The exposé of the secret dealings of the law firm Mossack Fonseca is making waves far beyond politics. Experts are also asking to what extent security holes in the firm's IT systems made the data leak possible in the first place. How negligently did the law firm configure its IT systems with regard to data governance? What risks do companies run if they leave unstructured data out of their data governance strategy? Christoph Stoica answers these questions and points to solutions.

In Janosch's children's books, the bear already told the little tiger: "In Panama everything is much nicer. Panama is the land of dreams." Many others apparently thought like Janosch's bear: politicians, companies, private individuals, celebrities and sports stars, and the list goes on. They all conduct secret business in tax havens on a previously unimagined scale, as the recent investigations by the Süddeutsche Zeitung and the International Consortium of Investigative Journalists (ICIJ) clearly show.

The documents of a Panamanian law firm show how they are entangled in deals built on offshore constructions. The leak comprises e-mails, deeds, bank statements, passport copies and other documents relating to roughly 214,000 companies, mainly in Panama and the British Virgin Islands. The so-called Panama Papers show how networks of banks, law firms and other intermediaries hide dubious assets in tax havens.

How security holes in Mossack Fonseca's IT systems helped enable the Panama Papers

Regardless of how the media obtained the documents, whether through an insider or through a targeted attack on the computer systems, one thing remains beyond dispute: the IT systems clearly did not meet security standards appropriate to our time. One can go further and argue that the company, and its IT department in particular, acted with gross negligence with regard to data protection and compliance requirements. Wired reported in its online edition that both front-end and back-end systems had not been updated and showed large security holes: the login portal for the client area and the content management system, for example, had not been patched since 2013, and Outlook Web Access was still at its 2009 level. On top of that, the servers were simply misconfigured, so that unauthorized viewing of directory listings was possible without any effort. Against this backdrop, the promise on the firm's website to always give clients "secure online access" through which they can reach "their company's information from anywhere in the world" sounds like a farce.
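Misconfigurations of this kind are straightforward to probe for. The following is a deliberately simple, hypothetical sketch rather than a real scanner: it requests a URL and looks for the telltale index page a web server generates when directory listing is enabled.

```python
# Minimal sketch: probe a URL for an exposed directory listing by checking
# for the "Index of" page most web servers emit when autoindexing is on.
# The URL and the detection heuristic are illustrative assumptions.
import urllib.request

def directory_listing_exposed(url: str) -> bool:
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            body = resp.read(4096).decode("utf-8", errors="replace")
    except OSError:
        return False  # unreachable, refused or HTTP error: treat as not exposed
    return "Index of /" in body or "Directory listing for" in body

# Hypothetical target; only ever probe servers you own or operate.
print(directory_listing_exposed("https://example.com/files/"))
```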

Data governance for unstructured data is often underestimated

Beyond the question of which security holes the attackers actually exploited, the structure of the leaked data is just as revealing. For every shell company, Mossack Fonseca created a working folder in which e-mails, contracts, transcripts, scanned documents and other records relating to that offshore company were filed. The leak comprises more than 11.5 million documents: 4.8 million database files, 2.1 million PDFs, 1.1 million images, 320,166 text files and 2,242 other files. Considering that Mossack Fonseca has been in the shell-company business for more than 40 years, it becomes all the clearer that, given this unimaginable volume of data, probably no one at the firm still knew exactly which data sat in which folder, to whom it belonged and how sensitive it was. In IT, such holdings are called "dark data" or "unstructured data", because they are not stored in a database or any other dedicated data structure.


Leaktivism comes of age

The volume of data stored on file servers and NAS devices, in collaboration portals, mailboxes and folders, or in the cloud grows explosively and uncontrollably year after year. This is not limited to law firms; it applies to companies in almost every industry. Whether in HR or legal departments, executive and supervisory bodies, research and development units or works councils, many parts of a company handle unstructured data every day. Companies that do not include unstructured data in their data governance strategy run ever greater risks in terms of security, legal conformity and compliance. While the motives of Edward Snowden, or of the still unknown source behind the Panama Papers, were those of an activist, the same weaknesses can sooner or later be exploited by the cyber mafia, and companies could find themselves blackmailed.

Data protection solutions must protect sensitive and critical data regardless of where it resides. Companies need a comprehensive overview of what data exists, how it is used, who is responsible for it and who can access it. Only then can legal requirements for data access, use and retention be met and confidential information be protected against unauthorized use and disclosure. Concepts for automated, audit-proof permission management must finally be extended to file structures as well. Access control can no longer be limited to enterprise applications and databases; in directory systems, too, the questions of privileged accounts and of monitoring sensitive areas must be addressed. Data governance cannot be implemented overnight, but Micro Focus has solutions that help you improve transparency, identify risks and introduce measures that give you back control over your data and improve compliance.
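A practical first step toward governing unstructured data is an inventory of who can read what. The sketch below, assuming a POSIX file share, walks a directory tree and flags files readable by every local user; commercial data governance tools layer ownership, classification and reporting on top of exactly this kind of sweep.

```python
# Sketch of a first inventory pass over unstructured data: walk a file
# share and flag files that any user on the system can read. POSIX-only
# and deliberately simple; paths and the share root are illustrative.
import os
import stat

def world_readable_files(root: str):
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                mode = os.stat(path).st_mode
            except OSError:
                continue  # broken link or permission denied: skip
            if mode & stat.S_IROTH:  # readable by "other"
                yield path

for path in world_readable_files("/srv/share"):  # hypothetical share root
    print("world-readable:", path)
```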

Christoph Stoica

Regional General Manager DACH

Micro Focus


Federal Breaches and COBOL – the OPM Hack Explained

Micro Focus Product Marketing Director Ed Airey explains the high-profile OPM hack. Was COBOL really to blame?

The U.S. Office of Personnel Management (OPM) recently experienced the largest U.S. governmental data breach to date, potentially exposing the personal data of up to 18 million current and former federal employees. To explain the breach, many have pointed the finger at COBOL, the venerable programming language. Critics maintain that because the language was written decades ago, attackers were able to find and exploit vulnerabilities in the OPM's systems.

However, even the strongest army base is at risk when the doors are wide open. Similarly, the security measures and access methods to core government systems and data, as the metaphorical gatekeepers, must be up to the task of protecting the prized possessions inside.

Why the Government, and Many Other Organizations, Use COBOL

People have a tendency to believe that whatever is new must be the best solution. It's time to set the record straight: the most likely candidates for ongoing success in terms of IT capability are the systems that work today, and have done so for years. So while COBOL isn't a new concept, it is an unrivalled technology for running core systems.

There is good reason why COBOL has been in active use for core business systems, across many platforms, for five decades. The U.S. Federal Government has billions of lines of COBOL in current use, because these applications are reliable and suit the government's needs. Without these systems, it would be very difficult for government agencies to deliver on their individual missions.

Outside of the U.S. government, the use of COBOL is even more pervasive, with over 200 billion lines of COBOL code across the financial and insurance industries as well as retail, logistics and manufacturing organizations, to name a few. In fact, COBOL is responsible for two-thirds of global IT transactions. COBOL's longevity is due to its unrivaled ability to adapt to technological change. Few languages over the past six decades have continually adapted to meet the demands of digital business and modern technology.

Addressing the Real Issues

While data encryption and multi-factor authentication are important security considerations, the broader IT security question is more significant. After all, even encrypted data can be stolen if it is poorly secured. So the real question we should ask after a breach is not what programming language an organization was using, but rather what security protocols and measures the organization employed to prevent unauthorized access in the first place. All applications require robust infrastructure security. Without it, all systems are at risk, regardless of their age. Here are a few specific questions any organization should ask before and after a security breach (a small illustrative check follows the list):

  • Does my organization follow proper password best practice, or are passwords too simple?
  • Do our users have the appropriate amount of access, or do some have unnecessary administrative rights?
  • Do we have identity and access management (IAM) processes in place that monitor user activity and alert us of suspicious behavior?
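The second question, in particular, lends itself to an automated check. Below is a minimal sketch that compares each user's granted rights against a role model to surface unnecessary administrative access; the role model and user records are assumptions invented for the example.

```python
# Illustrative sketch: surface rights a user holds beyond what their role
# entitles them to. The role model and user records are invented examples.

ROLE_ENTITLEMENTS = {
    "hr":        {"hr_portal"},
    "developer": {"repo", "ci"},
    "dba":       {"repo", "db_admin"},
}

users = [
    {"name": "alice", "role": "developer", "rights": {"repo", "ci"}},
    {"name": "bob",   "role": "hr",        "rights": {"hr_portal", "db_admin"}},
]

def excess_rights(user: dict) -> set:
    allowed = ROLE_ENTITLEMENTS.get(user["role"], set())
    return user["rights"] - allowed

for user in users:
    extra = excess_rights(user)
    if extra:
        print(f"{user['name']}: unnecessary rights {sorted(extra)}")
# -> bob: unnecessary rights ['db_admin']
```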

If members of an organization cannot answer these questions confidently, there are security gaps that need addressing immediately. These issues affect peripheral systems—web, client, server and other user interface systems that enable access to back end data. Attackers typically look for these frontend vulnerabilities in order to gain access to the backend applications, systems and data. Poor security practices leave the metaphorical front door open, giving attackers access to the whole house.

In short, whether an organization uses Java or COBOL is irrelevant if the organization's security protocols and practices are lacking. This was indeed the case at OPM. Inspector General McFarland noted in his Capitol Hill testimony that OPM had failed to act on the recommendations of his office to modernize and secure its existing IT infrastructure. McFarland further commented that such failures were likely the cause of this breach.


Modernizing COBOL systems to meet new challenges

COBOL's proven reliability and longevity are misinterpreted as signs that it has not evolved to support modern IT requirements or is deficient in some other way. U.S. Federal CIO Tony Scott has even suggested that the government needs to "…double down on replacing these legacy systems." Replacing COBOL, however, is not the answer and would undoubtedly introduce many more challenges to a government IT organization already struggling to keep pace with modern tech advances. The smarter move is to innovate from a position of strength, which COBOL provides.

Modern COBOL technology delivers the trusted reliability and robustness that it did in 1960, but with the ability to connect to modern technologies and architectures including cloud, mobile, .NET and Java, as well as the latest hardware platforms from the z13 mainframe to the latest incarnations of Windows, UNIX and Linux. By supporting and integrating with the latest platforms and digital technologies, IT can rest assured and get on with more pressing concerns, such as implementing appropriate security strategies for its evolving systems.
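As one hedged illustration of that integration, a common modernization pattern is to leave the COBOL transaction untouched and expose it behind a thin REST facade that cloud and mobile clients can call. The endpoint, path and response format below are entirely hypothetical.

```python
# Sketch of calling a COBOL-backed service through a hypothetical REST
# facade. The endpoint and payload are assumptions; the point is that
# clients need not know COBOL sits behind the API.
import json
import urllib.request

def check_balance(account_id: str) -> dict:
    req = urllib.request.Request(
        f"https://core.example.com/api/accounts/{account_id}/balance",
        headers={"Accept": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)  # JSON rendered by the facade over the COBOL program

# check_balance("4711")  # would call the hypothetical facade
```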

Given the seemingly ever-increasing digital threat our IT systems face, it's critical that IT leaders provide a more responsive, flexible and integrated management system to secure these mission-critical applications from unauthorized use. Modern COBOL offers a straightforward way to address the issues behind the OPM security breach and an opportunity to significantly improve the existing security infrastructure.



Original article written by

Ed Airey

Amie Johnson

Derek Britton

Regulation Acceleration

Regulation and Compliance shouldn’t be big news – after all, the IT world has had to conform to rules and regulations for years. Yet it seems every day there’s bad publicity and a hefty fine for yet another major corporation facing a non-compliance ruling. This blog looks again at the challenge that just won’t go away – and asks what we can learn.

I have spoken before about regulatory compliance and the necessity for IT to look to make systematic improvements to how it supports a variety of compliance and regulatory changes. It seems that in 2014 there are no signs of things getting any easier. Let's take a look at the state of regulation today.

Here is a sample of the publicity from recent months:

The usual suspects

The biggest single cluster of regulatory news affects the financial services industry. Adversely affected by the global economic downturn, financial services organizations have since been the target of stringent new regulatory controls. Unsurprisingly perhaps, news abounds across a variety of “non-compliance” issues in the industry.

In a case of internal compliance, Credit Suisse was recently reported to be investigating two of its own dealers for trading rule transgressions. A broader industry issue, especially in the UK, has been PPI mis-selling. Recently, the UK Financial Conduct Authority (FCA) probe has prompted 2.5 million PPI cases to be reopened. The impact of PPI regulatory measures was reported as a cause of Lloyds Bank's profit fall.

Meanwhile, other regulations were contravened in high-profile cases. Deutsche Bank was fined over fiscal reporting, while the Royal Bank of Scotland's mortgage advice irregularities resulted in a fine of £14.5M ($23.7M). Elsewhere, the LIBOR rigging scandal has hit Lloyds to the tune of £218M ($356M), while Bank of Scotland's "double-billing" scandal was described in the law courts as "unconscionable" as the FCA continues its investigation.

In terms of notoriety, however, spare a thought for Citi Group – its part in the financial meltdown has resulted in an astonishing $7Bn penalty, as reported in a press release.


The verdict by industry observers is understandably blunt. Trust in Banks is still “years away”, according to the chairman of the UK Treasury Committee, Andrew Tyrie. Meanwhile in some cases, jittery Fund managers are deserting banking stocks. And there’s no sign of things easing up – regulators are getting more stringent in their measures, while the recent SEPA regulation is being closely followed by an equally exacting new control, FATCA – the Foreign Account Tax Compliance Act, set to go live in 2015.

Not just financial services

Regulatory compliance, and failure thereof, is by no means the exclusive remit of the financial services industry. Electronics giants Philips, Samsung and Infineon were subject to a total of €138M in fines over pricing irregularities. Telecoms giant Verizon was fined $7.4M over consumer clarity complaints, while energy supplier EDF was ordered to pay £3M ($4.9M) to support vulnerable customers after failing to manage complaints.

It’s no secret

Data privacy regulations are a hot topic, and most news reported on the topic is bad news for the brands in question. High-profile stories surrounding data privacy breaches have recently hit the headlines at Home Depot, Supervalu and UPS. However, the press saved the most column inches for the unfolding Community Health Systems saga, where the data hack is reported to have affected 4.5 million customers.

Emerging from the shadows

What do all these stories have in common? The attribute that links them is that each story has been reported in the last few months – a commonplace, recurring theme suggesting a persistent challenge across a variety of industries.

The cost of non-compliance in individual cases might mean specific and often eye-watering fines, while the longer-term operational impact on a variety of industries, not least the financial services sector, is untold risk and potentially irreparable brand damage. Coping with this is being taken very seriously – industry publication Banking Technology reports a Bank of England estimate that 70,000 new finance roles will be created in Europe alone to help tackle the increasing compliance workload.


But headcount is not the only requirement. Throwing more staff at a problem where the processes and supporting technology are outmoded and inefficient simply means more chefs in a tiny kitchen.

Technology needs to be part of the solution.

And it can be. Micro Focus’ approach to IT regulation sees the challenge as a three-pronged issue – find the root of non-compliance, fix the issue, and then validate the change. We refer to this as Find It, Fix It, Test It. This approach leverages the best in technology to help automate and streamline these critical IT change projects, which all too often have unmovable, aggressive timescales. If you need to accelerate your regulatory efforts, we can help.

Core Values – Why We Need to Act Fast

Amid concerns over its ability to provide what the business needs, IT must tackle significant operational challenges to deliver more, and deliver it faster. This blog explores the current IT leadership predicament, and discusses how to streamline complex IT processes by discovering smarter ways of extracting value.

The here and now

IT supports the business and plays a critical role in its performance. Important long-standing core applications provide the fundamental backbone of business operations – and over many years an irreplaceable, comprehensive IT environment has evolved.

However, that doesn’t mean there aren’t concerns. For many, IT budgets remain stagnant, yet the organization has a growth strategy. For most organizations the majority of budget is spent on the day-to-day running of the business, referred to as ‘keeping the lights on’. Why? An array of innovative systems and processes has made the business what it is today. But such innovation has come at a cost – a recent study states around $11M per organization is required to address the backlog. And with more innovation, comes more backlog.

The backlog continues to grow, the complexity of the IT environment follows suit, and future agility diminishes. The downward spiral continues. Even armed with the very latest zEnterprise mainframe technology, IT leaders face unprecedented challenges in continuing to support business growth.

Oh, sounds bad.

Actually, it's worse than that. There's an abundance of other challenges to deal with – the ever-growing IT backlog, continuous changes in compliance regulations and numerous outsourcing challenges. Additionally, organizations must continually widen and improve their skills pool. Let's consider each of these concerns:

Consider compliance

Regulatory compliance is a pressing concern for many IT departments, but far too often it gets pushed to the bottom of the list. It takes time, effort and prioritization. And on top of all that, it takes focus away from delivering what really matters back to the business.

Governance, risk and compliance projects are unplanned, non-negotiable IT milestones with far reaching consequences. Meeting regulations with finite IT resources is a challenge that limits the ability to focus on innovation.


Keeping the lights on – the true costs

Gartner reveal upwards of 70% of an organization’s IT budget is for ‘lights-on’ activities only. This is referred to as ‘dead money’ as it isn’t directly contributing to business growth or enhancing competitive advantage. This figure of 70% is only expected to grow, with CIOs estimating a 29% rise in ‘IT debt’ over the last 18 months.

The high percentage of ‘lights on’ budget means very little is left, and consequently, placement of remaining resources is more critical than ever and ultimately affects the ability to deliver, grow and maintain competitive advantage.


Outsourcing – a global panacea?  

Application outsourcing accounts for a major proportion of global application maintenance activities. A recent study suggests that 48% of CIOs outsource all testing and development projects.

Implementing the best possible technical infrastructure is a challenge and carries many considerations: Is it cost effective? Will it affect quality? Can more be achieved? Can coherent system integration be achieved? For some, a move towards outsourcing is often associated with a loss of control, hidden costs, security and confidentiality intrusion as well as additional concerns. Meanwhile for many others, it’s a way forward – access to skilled staff, increased operational efficiency and improved flexibility. However, establishing and controlling an effective outsourcing strategy remains a significant operational challenge.

Perceived resourcing concern

As businesses evolve, so must core systems, and critical COBOL applications must do more than ever. Keeping pace with that evolution can be a significant resourcing challenge, as new skill requirements emerge. Outsourcing, as mentioned above, could be an option but it might not be considered the appropriate strategy. Either way, organizations now require a more specific skill set than ever before, which has consequently created questions around development skills.

Timing is everything

There’s no respite in the operational challenges facing IT – these enterprise environments are highly complex, innovation capacity is limited and delivering business value – quickly – is severely compromised. The time to find a way to manage the current, while delivering the new, can’t come soon enough.

Delivering fast enterprise time to value

In a recent BBC report, the UK banking industry is "puzzled" at productivity levels that remain below those prior to the 2008 financial crisis. From an IT perspective, when one considers the issues above, and the difficulty of delivering against such a kaleidoscope of internal concerns, it may be less surprising than at first glance: poor internal efficiency can only hamper large organizations' ability to deliver the volume and quality of services the business needs.

What if there was a technology that could enable organizations to efficiently tackle the day-to-day operational challenges, freeing up time, and putting the control back in your hands? What if the power of the mainframe estate could be harnessed yet further?

Imagine unrivaled technology that helps tackle the challenges of compliance, IT backlog, outsourcing and skills. Technology that makes the CIO a hero once again – and delivers value back to the business quickly.


With Micro Focus there is a way.

The Micro Focus Enterprise Solution leverages the power of the mainframe to further streamline business processes and transform enterprise application delivery. Using industry-standard technology including Eclipse and zEnterprise, it helps tackle regulatory compliance challenges head-on, identifies and mitigates factors contributing to the backlog, supports outsourcing strategy, and addresses internal application resource concerns. Micro Focus provides solutions for all phases of the enterprise application delivery cycle, including improved application intelligence, user access, application change and development, unit and system testing, and workload optimization, offering a 50% improvement in application delivery.

Learn more!

Watch the introduction video and read ‘The 10 ways to transform time to value’ and ‘Quick Reference’.

Regulatory compliance: the time is now

Micro Focus Product Manager Jordan Ashman investigates Compliance and argues that the time to ‘find it, fix it and test it’ really is now in this revealing blog.

In our recent blog, You have 20 seconds to comply…, we explored the importance of regulatory and legislative compliance as it pertains to IT, and how organizations can stay ahead of the game in the face of mounting industry pressure. In this blog we look closer at how often organizations, of all sizes, in an array of industries, are faced with significant compliance efforts – and we’ll dive into some real examples of where things have gone wrong.

In any IT organization there’s always more than one focus, more than one “must have” priority task – so how important are compliance projects? In many recent cases, by the looks of it, the answer is, alas, not important enough. Recently, compliance (or rather, non-compliance) horror stories, where organizations have fallen foul of a vital regulatory measure, usually with dire reputational consequences, seem to have littered the press. Barclays Chief Executive, Antony Jenkins, said that rebuilding the brand after recent high-profile issues would take between 5 and 10 years, while JP Morgan has employed over 3,000 new employees to prepare for settlements with regulators over recent compliance incidents.

A big problem?

A recent global survey of CFOs regarding regulation depicts just how big a concern compliance is. Regulation and compliance top the list of challenges causing the most concern, and with further new regulation still to be enforced, the percentage will surely only increase.

What’s the Risk?

Consider some recent results when compliance isn't at the top of the 'to-do' list. Careless activities have spawned compliance-related fines at an unprecedented scale. JP Morgan set aside $20B in 2013 for compliance-related litigation costs. Meanwhile RBS has had to set aside over £3 billion to cover claims relating to the latest financial crisis – the mis-selling of mortgage products, PPI claims and interest rate hedging. It seems not a day goes by without a fresh compliance news story hitting the tabloids and broadsheets – and the same goes for social media channels such as Twitter, which is often a faster, and less forgiving, medium for complaint. A #compliance search on Twitter yields many hundreds of unique, negative stories.

What’s the Outlook?

What about the coming months? Surely we are through the worst of the effort in terms of meeting industry and legislative rules and regulations? It appears not. Compliance workload looks like it is here to stay – and with proposed amendments to the Data Protection Regulation it's time to review how compliant your organization is before you're affected. According to Computer Weekly, the proposed amendment will require additional security measures to be implemented by all European businesses that process personal data – companies that fail to comply face fines of up to 5% of annual worldwide turnover, or €100m. A recent data breach has led to US banks re-issuing over 17 million payment cards – presumably this will prompt US compliance regulation to be tightened further in order to avoid such issues occurring again.

Home Rule

Even tackling the major external regulatory requirements is not the end of the story – there are a number of internal considerations IT must also address. Business operations are complex; many organizations outsource functions, frequently introduce new technologies and use third-party vendors. From certifying and complying with technical standards to establishing and managing service level agreements, right down to internal coding standards, this presents a cornucopia of IT projects and deadlines jostling for position in the list of overall priorities.

The Time is Now

With regulatory compliance efforts on the increase and fixed deadlines to tackle, there still remain many non-compliant organizations failing to meet various standards. Recently, the European Commission proposed a six-month extension to the deadline for European countries to become compliant with the Single Euro Payments Area (SEPA) – a final warning to laggards.

If the compliance discipline remains largely undisciplined, and yet the industry continues to groan under the strain of greater and greater regulation, then smarter ways must be found in IT to cope with the burden and establish a process to ensure 100% compliance.

Micro Focus’ refreshingly straightforward approach to IT regulation sees the challenge as a three-pronged issue – find the root of the non-compliance, fix the issue, and then validate the change. This Find It, Fix It, Test It approach leverages the best in technology to help automate and streamline these critical IT change projects, which all too often have unmovable, aggressive timescales. It is this approach which can also be used in a whole variety of IT modernization projects across the enterprise.

Learn more about our approach by visiting here. The right time to smarten up your IT compliance process is now.

You have 20 seconds to comply…

I think you’d better do what he says, Mr. Kinney

In the 1987 film Robocop, the enthusiastic compliance of Mr Kinney is ignored by the malfunctioning law-enforcement robot, ED-209, with fatal consequences. Many of the legal imperatives and regulations facing the IT world today are accompanied by an unmovable deadline and threats of punitive measures – the beleaguered IT team could be forgiven for feeling like another short-lived extra in a dystopian sci-fi movie. While the timeframe given typically exceeds 20 seconds, the deadlines are usually aggressive and non-negotiable, making the associated IT change project a high-priority "must have" and the budget a "must spend". Worse still, as evidence of economic and management frailty continues to beset many industries, regulatory bodies have "no shortage of excuses to launch … [a] clampdown" with further compliance measures.

Comply with this

Compliance feels like a faintly vague term. What are we complying with, and who told us to do so? Here’s how one industry commentator defined compliance: The process of adherence to policies and decisions. Policies can be derived from internal directives, procedures and requirements, or external laws, regulations, standards and agreements. Consider now the following selection of regulatory or legislative changes that have emerged in the last decade or so:

[Table: a selection of regulatory and legislative changes introduced over the past decade]

Of course, this isn't a one-off task. Many of these regulations require compliance not only in the first instance, but also as part of an ongoing audit and reporting process. So the "compliance work" is an annual event to build into the IT schedule. Add to this little list the efforts undertaken on in-house regulations, including coding guidelines, standards adherence and code complexity criteria, and it is no wonder that these efforts are forcing an unprecedented demand in IT simply to 'keep the lights on'.

Getting Ahead of the Game

An untold variety of technical complexity awaits the intrepid IT team seeking to conform and comply with the latest round of regulations. However, such measures have a couple of key things in common:

• Core application code will need to change
• How data is stored will (usually) have to change

Irrespective of the new regulation or measure, extra elements that comprise the new activities need to be wired into the applications that currently provide that business function. It therefore holds that a fundamental approach – a lifecycle for change – is needed for IT teams to follow in order to plan and execute an effective compliance project:

1. Find it: Uncovering the breadth and depth of the required IT change
2. Fix it: Executing the change program as efficiently as possible
3. Test it: Establishing a full change validation process, incorporating data privacy needs

This lifecycle closely resembles the Software Development Lifecycle (SDLC) – unsurprisingly perhaps because there is, at the heart of both things, a major change to a core application required.

Find Your Best Practice

Having helped organizations find, fix and validate their large-scale change programs since the days of Y2K and the Euro conversion, Micro Focus has provided an efficient, scientific and rapid solution for a variety of compliance activities. Whether the concern is determining the scope of the required changes, executing the application change effort itself, or establishing a secure and streamlined validation process, Micro Focus enables organizations to meet aggressive deadlines with greater confidence, enabled through smart technology.

Look out for the Micro Focus Compliance program.

[1] Harry Wilson, Daily Telegraph, 8th October 2012, "Bleak Future for the Banks of Tomorrow"

[2] http://blogs.gartner.com/paul-proctor/2013/05/13/why-i-hate-the-term-grc/


Dealing with IT Debt – The Lights are on (Part 2)

Introduction

My recent blog discussed the topic of "keeping the lights on"[1] in IT terms. Important day-to-day activities including routine maintenance, system enhancements and important compliance and governance activities amount to a significant operational workload. The discussion elicited a number of questions on the topic, from a variety of sources. In this blog, I want to tackle those questions.

Lights On – is it getting any easier?

Alas, no. Typically, Lights On activities are reviewed, managed and prioritized using the concept of a "Backlog", an IT to-do list. The backlog is an indication of the volume of outstanding must-do requirements. More interestingly, a measurement of the cost of implementing the backlog throws into sharp relief the extent of the issue. Gartner estimates that within the next 12-18 months, the global IT backlog, in dollar terms, will have grown from $500Bn (in 2010) to reach $1 trillion. This is what the industry calls IT Debt[2].

Why are things getting worse?

The growing burden on IT may arise for a variety of reasons, which will vary with the industry, organization and existing IT estate.

We don’t know where to start – IT estates are often vast, but application experts are often too busy to provide their perspective. However, system documentation and knowledge is usually woefully insufficient to help less experienced staff get “up to speed”. A study by Vanson Bourne[3] revealed that nearly half of all organizations had no process for assessing or measuring IT Debt, despite technology existing to cater for this[4].

IT gets more complex every year – IT estates of anything but the smallest organization are complex beasts, typically with insufficient understanding of all the relationships and interdependencies. Knowing which systems to change, and how, and in what order, is anything but straightforward.

We have more updates to do – the systems of 2013 are supporting more users, through more channels, for more hours of the day, with more variety, than ever before. All of this needs supporting in the back end.

We have more compliance to do – the notion of the "large scale compliance project" took hold with Y2K and the Euro, then Sarbanes-Oxley, HIPAA, Solvency II and Basel II/III, and has accelerated more recently in the wake of the economic crisis. Regulations and legislation on data privacy, banking standards, customer information and international taxation have added to the workload. We'll look at compliance in more detail in upcoming blogs.

We don’t have the (right) tools – many core systems are mainframe COBOL-based, which use mainframe technology for analysis, development, testing and deployment. Some of these tools have struggled to keep up with technological advances and provide only limited scope to improve efficiency. As years pass, the ability for the tools to support and keep up with the pace of change and new requirements has become steadily worse.

Tackling these concerns would be central to any solution to the IT Debt challenge.

Why not just rip it up and buy one that works already?

On a recent Micro Focus webinar on the topic of IT Modernization, an attendee asked “why not just remove the difficult systems and replace with a package?” It is certainly tempting, faced with an overwhelming backlog, to consider “starting over”.  However there are a number of extremely important considerations affecting core systems strategy:

As significant as the “lights on” burden may be, the core systems being maintained continue, in the vast majority of cases, to provide unique capability for the organization. Often, this capability is part of its competitive edge. Replacing such systems with an off-the-shelf solution runs a major risk of losing any competitive advantage.

Additionally, while a “package” may seem at face value to be a simple, low-risk undertaking, the implementation of a package that works for the client’s own unique requirements is far from simple – Standish Group’s Chaos Report[5] mentions a 40% rate of failure where new packages were delivered late, over-budget or missing key functionality.

A Micro Focus client undertook their own study looking at the relative merits of package replacement compared with continued evolution (modernization) – according to the four major decision criteria of cost, risk, time to market and competitive advantage, package replacements were considered inferior to the application modernization approach, which they chose.

Why not keep the function, but rewrite it in a new language?

Where organizations have expanded their IT footprint and now possess a wider range of technical and language skills, it is tempting to consider writing from scratch the core systems in a different language. Such an approach may appear appropriate in terms of skills and technical strategy, but again caveats would need to apply.

First, in rewrite projects, it is extremely difficult to “phase in” replacement functionality until the entire system is ready. This means waiting until the project is finally completed and delivered before retiring the incumbent system. Often, this is a long wait.

The risk of failure for such projects, according to the same Standish Report, runs at over 70%. The figure is high because the complexity and effort of such projects is seldom fully understood at the outset.

Furthermore, it is worth pointing out that the full cost of such a project is not at the point of delivery, it is the life-cycle of that system. An illuminating report by CAST Software discovered that systems written in COBOL are up to 4 times cheaper to maintain than an equivalent system written in Java. Determining an ROI for a rewrite raises a number of very important questions about the viability of such an approach.

Why not just do what we are doing today – but better?

The problem with existing systems is the amount of new work organizations still need to do. In reality, that is not a problem to do with the existing system at all; it is a problem to do with how busy an organization is – this points at resource or organizational challenges rather than purely technological ones.

In many cases, while not without their challenges, core systems work. They embody and enable the day-to-day running of the business, and continue to provide value and support revenue generation. They are, as the name suggests, core to the success of the organization.

A modernization approach nurtures and evolves those core systems, but does this in a way that is efficient, streamlined and forward-thinking, which helps protect existing value while supporting innovation.

The process of application modernization is shaped by the key challenges facing the client, as this can dictate the start point for the journey. Whether the issue is to do with knowledge of the systems, the ability to execute change, the pace of delivery, challenges with testing, or even the effort of managing production workload, there are ways of improving processes and supporting technology to help reduce the backlog and support new initiatives. Micro Focus has worked with hundreds of clients to help them evolve their core systems by tackling difficult operational challenges while protecting core IT assets.

But COBOL? Really? It’s 2013!

COBOL is often regarded by those who don't know it well as great in its day but a generation out of date. One measure of this is how current the language is: COBOL provides equivalent capability and support in the latest industry IDEs as well as supporting the most recent platforms including .NET, JVM, Cloud and Mobile[6]. Moreover, the fact that COBOL is behind many core systems and runs the vast majority of all global business transactions rarely raises an eyebrow. COBOL has also seen something of a resurgence in terms of academic training[7], and in September re-joined the top 20 most popular languages as measured by the TIOBE index[8]. The notion that COBOL is out of date is itself an outdated idea.

Modernizing trusted core COBOL systems can help address IT Debt. Micro Focus offers technology to enable organizations to streamline their lights on activities. Is your IT backlog continuing to grow, despite significant investment? Do you face significant barriers to modernization? Perhaps it is time to take a fresh look. www.microfocus.com

The lights are on, but no-one's home…


According to the industry, lights-on IT activities cost a lot – but where is all the money going?

Introduction: How’s it going?

A study published in InformationWeek in October 2012 reported the results of a survey into the reputation of IT as a source of innovation. The most striking headline statistic was that in many cases (57%) IT thinks it's doing OK (that it's "agile, flexible"), yet the business (29%) doesn't agree.

Why the difference of opinion? One major factor is that, in many situations, IT cannot afford to do everything it needs to. The overworked IT executive team must explain that the majority of time, budget and resource is spent simply ‘Keeping the lights on’. We’re told that the majority of resource is needed just to keep the wheels turning.

Costly confession

What does ‘Lights On’ actually mean? “In a line-item budget, lights-on is a descriptor for expenditures that are absolutely necessary for maintaining a company’s critical business operations. Lights-on differentiates a ‘need’ from a ‘want.’” Gartner Group states that 70% of IT budgets are spent on lights-on activities. Forrester Research suggests this number may be as high as 80%. At accounts Micro Focus has spoken with, that ‘lights-on’ percentage was placed as higher still – in some instances up to 90%.

However, these activities are not keeping pace with the amount of required change. According to Gartner, the overall cost of what it calls ‘IT debt’ is projected to reach $1 trillion by 2015.

Counting the cost

Faced with an OPEX of perhaps tens of millions of dollars, many IT leaders have wanted to learn more, to find ways to reduce that burden. So, what does lights-on actually mean in terms of time, effort and dollars spent? There are three major categories that typically comprise ‘lights-on’:

  • The routine maintenance of applications and systems
  • The application improvement or enhancement backlog
  • Regulatory and Legislative Compliance

Let’s look at these in turn.

Maintenance

As up-to-date as IT systems may be, they must evolve constantly to remain current and support the evolution of the business. Routine maintenance, either in terms of bug fixing and updates for ‘home built’ systems, or patches/updates from the vendor, are a fact of life for core IT systems. Regular updates to hardware, storage and communications also comprise a necessary activity in IT.

Such activities will naturally require time and effort from IT staff, as well as incurring maintenance or leasing costs to hardware, software and related suppliers. Typically, maintenance activities are not fully understood by other teams. Maintenance tasks include correcting errors, keeping up with new platforms and third party software integration. It’s not just bug fixing.

Enhancements

Maintenance activities may be important for keeping systems up-to-date. But, more often than not, business stakeholders or user groups may be demanding more dramatic or visible improvements to IT systems in support of business changes. New markets, new customers, new levels of performance or new channels may be best obtained by leveraging existing systems, tailored or enhanced beyond their current remit to serve a broader purpose.

However, planning and implementing such incremental improvements is often seen as part and parcel of the IT operating expense. Fundamentally, these activities are also seen as part of the overall cost of the lifecycle of that application or system. As such, ‘enhancement’ work may simply be seen as a specific sub-category of maintenance[i].

Compliance

Since the Enron scandal and in the recent period of economic austerity, the public has borne witness to unprecedented scrutiny of industrial-scale problems – banking and retail system outages, LIBOR rigging, PPI mis-selling, insider trading and overseas corporate tax avoidance – leading to widespread public and governmental criticism of industry bodies and major organizations. This scrutiny, together with new legislative changes, has resulted in an array of compliance measures being introduced.


For legislative, industry regulatory or procedural requirements, the task falls on IT to implement the changes within its core systems to support the legislative requirement. Organizations may look at implementation ahead of specific deadlines as a level of readiness, even before legislation is passed. Either way, the imposition of new regulation includes prescribed timescales, often requiring IT organizations to “find room” to add this new project work.

Whether global (ISO 27002, Basel II/III), international (FATCA 2013 international tax compliance), continental (SEPA, Euro), country-specific (e.g. PCI DSS, FISMA, the Joint Healthcare Commission, HIPAA and SarbOx in the USA) or industry-specific (e.g. Basel, Solvency II and countless others in financial services), the number of new compliance measures to support has continued to evolve and grow over time.

Conclusion

Irrespective of the composition of lights-on activities, the biggest concern is the overall operating expenditure devoted towards them. Simply, this is not seen as moving the business forward. Tech-savvy consumers are demanding cloud, mobile and new IT architecture and this new generation of customer is forcing organizations to look hard at their strategy.

IT is under continued pressure to do more, but the burden of the existing day-to-day workload has never been greater, and continues to grow. IT leaders need to look towards smart ways to combine innovation and lights-on projects to stand any chance of accelerating delivery of both sets of requirements.



[i] Lientz, B.P. and E.B. Swanson, Software Maintenance Management, Addison-Wesley Longman, 1980.