The true cost of free

There is always a low-cost vendor offering something for free to win market share. In enterprise IT, it is worth examining what free really means. Derek Britton goes in search of a genuine bargain

Introduction

IT leaders want to help accelerate business growth by implementing technology to deliver value quickly. They usually stipulate in the same breath the need for value for money. The pursuit of the good value purchase is endless. No wonder then that vendors who offer “use our product for free” often get some attention. This blog looks at the true cost of ‘free’.

Measuring Value

We all use desktop or mobile apps which, if they stopped working – and let’s face it, they do from time to time – wouldn’t really matter to us. We would mutter something, roll our eyes, and re-start the app. That’s not to say that people aren’t annoyed if they’ve not saved some important work when their application stops, but typically the impact is nothing more than a briefly disgruntled user.

But if an application is doing something critical or strategically important for an organization, then it sits higher up the value scale: an ATM application, a savings account system, package logistics, money transfer, credit checking, an insurance quote, a travel booking, a retail transaction. What if it went wrong? What if you also needed it to run elsewhere? What value would you put on that? Vitally, what would happen to the organization if you couldn’t do those things?


Get it for free

Application development tooling and processes tend to incur a charge, as the link between the technology and the valuable application is easily determined. However, additional technology is required to deploy and run the built applications, and here the enticement of a “free” product is very tempting. After all, why should anyone pay to run an application that’s already been built? Many technology markets have commoditised to the point where the relative price has fallen significantly. Inevitably, some vendors are trying the “free” route to win market share.

But for enterprise-class systems, one has to consider the level of service being provided with a “free” product. Here’s what you can expect.

Free deployment typically comes with no vendor responsibility if something goes wrong with that production system. Internal IT teams must therefore be prepared to respond to applications not working, or find an alternative means of insuring against that risk.

A free product means, inevitably, that no revenue is generated for the vendor, which means reinvestment in future innovations or customer requirements is squeezed. For example, the choice of platform may be limited, as may support for, or certification against, third-party software. Soon enough an enticing free product starts to look unfit for purpose due to missing capability or missing platform support.

Another typical area of exposure is customer support, which is likely to be thin on the ground because there is insufficient funding for the emergency assistance provided by a customer support team.

In a nutshell, if the business relies on robust, core applications, what would happen if something goes wrong with a free product?

An Open and Shut Case?

Consider Open Source and UNIX. At a time when UNIX was a collection of vendor-specific variants, each tied to particular hardware (AIX, Solaris, HP/UX, Unixware/SCO), there was no true “open” version of UNIX and no standard. The stage was set for someone to break the mould. Linus Torvalds created a new, open source operating system kernel, free to the world, and many different people have contributed to it: technology hobbyists, college students, even major corporations. Linux today represents a triumph of transparency, and Linux – and Open Source – is here to stay.

However, that’s not the whole story. It still needed someone to recognize the market for a commercial service around this new environment. Without the support service offered by SUSE, Red Hat and others, Linux would not be the success it is today.

Today, major global organizations use Linux for core business systems, and Linux now outsells other UNIX variants by some distance. Why? Not just because it was free or open source, but because the service it delivered to organizations was good value. People opt to pay for additional support because their organizations must be able to rectify any problems, which is where organizations such as SUSE and Red Hat come in. Linus Torvalds was the father of the idea, but SUSE, Red Hat (and their competitors) made it a viable commercial technology.

Genuine return

Robust, valuable core applications require certain characteristics to mitigate any risk of failure – risks that are unacceptable for higher-value core systems. Of course, many such systems are COBOL-based. Such criteria might include:

  • Access to a dedicated team of experts to resolve and prioritize any issues those systems encounter
  • Choice of platform – to be able to run applications wherever they are needed
  • Support for the IT environment today and in the future – certification against key 3rd party technology
  • A high-performance, robust and scalable deployment product, capable of supporting large-scale enterprise COBOL systems

The Price is Right

Robust and resilient applications are the lifeblood of the organization. With 4 decades of experience and thousands of customers, Micro Focus provides an award-winning 24/7 support service. We invest over $50M each year in our COBOL and related product research and development. You won’t find a more robust deployment environment for COBOL anywhere.

But cheap alternatives exist. The question one must pose, therefore, is what does free really cost? When core applications are meant to work around your business needs – not the other way around – any compromise on capability, functionality or support introduces risk to the business.

Micro Focus’ deployment technology ensures that business-critical COBOL applications that must not fail work whenever and wherever needed, and will continue to work in the future; and that if something ever goes wrong, the industry leader is just a mouse click away.

Anything that is free is certainly enticing, but does zero cost mean good value? As someone once said, “The bitterness of poor quality remains long after the sweetness of low price is forgotten”.

2016 Young Technology Scholars Winners Announced

Micro Focus and the Utah State Board of Education are pleased to announce the 2016 winners of the Young Technology Scholar award. The award recognizes talented high school students who demonstrate impressive technological skill and well-rounded character. For almost 20 years, the Young Technology Scholar award has been granted to 121 students from 33 schools around Utah and has helped fund scholarships to further their education in pursuit of a technology-related degree.

Micro Focus and the Utah State Board of Education funds scholarships for local Utah youth

Technology is a big part of our world—and the youth of Utah have noticed. In the last sixteen years, the number of engineering degrees in Utah has increased by 130%. It’s a positive trend, so to encourage local tech-savvy high school students, Micro Focus and the State Board of Education created a scholarship program called Young Technology Scholars. The program awards talented, self-motivated high school students scholarship money that they can put towards the degree of their choice.

The founding of Young Technology Scholars

The Young Technology Scholar program was started in 1997 by the Utah State Board of Education and by Novell (now a part of Micro Focus). Together, each contributed $250,000 to create a scholarship fund as a way of rewarding talented high school students for their high achievements in both technology and personal pursuits.

To be considered for the Young Technology Scholars award, applicants had to be highly involved in technology-related courses at their high school or applied technology college. Additionally, each applicant had to obtain a Micro Focus or SUSE Linux certification before receiving the scholarship award. The application process required students to submit an application, a résumé, two letters of recommendation, a transcript, and a description of their community service. The program wanted to ensure the scholarship money was put to good use, so students were asked to describe their future goals and how they planned to use the money.

Shirley Reynolds, Training Coordinator for Micro Focus and a member of the Young Technology Scholars board, has been involved with the program for several years and enjoys the excitement and energy the recipients bring each year. Reynolds says that the selected students are highly involved and dedicated to both their schools and their communities. “When you’re involved with something outside of school, it tells what kind of person you are,” Reynolds said. “It shows how motivated you are to get everything out of life that you possibly can.” The Young Technology Scholar program furthers students’ motivation to be involved by giving them a goal to work towards and rewarding them for their efforts.

Focus on education and opportunity

The biggest goal of the program is to help high school students realize the importance of a college education. “We want to help students focus on getting a degree they are really interested in while relieving some of the financial burden,” said Reynolds. The program encourages students to pursue their true interests, including computer science and technology. One 2016 winner, Corbin Hinton, plans to put his scholarship money towards pursuing a degree in technology while also studying music.

A major part of the program is increasing students’ technological skills in order to strengthen their résumés. According to Reynolds, the skills recipients learn while applying for the scholarship benefit their future careers. Many past winners were able to create their own small IT businesses in high school using the skills they learned in their technology classes. Other past recipients have gone on to work for large computer software companies throughout Utah. “It’s really cool to see where the winners go throughout their lives,” Reynolds said. “It’s incredible to see the energy and the excitement these students have.”

Involved students also have the chance to develop and act on new ideas. “They’re so creative,” said Reynolds. “They’re creating new apps and tearing down and rebuilding computers from a really young age.” One 2016 winner, Anthony Bible, experimented with a Raspberry Pi device to program a robot with sonar detection. He enabled the head of the robot to move up and down to change camera views. Another 2016 winner, Tim Allison, installed a wireless hotspot that could turn pictures upside down.

Shaping the future

Many recipients hope to work in an IT environment once they finish school. Andrew Roberts, a 2016 winner, wants to work for Micro Focus after graduating to further improve his technical skills while benefitting the company. Other students wish to use their technology skills in less conventional ways. 2016 winner Joshua Dickison wishes to study aerospace engineering and design his own aircraft. Click here for more information about the 2016 winners.


For almost 20 years, Young Technology Scholars has awarded 121 scholarships to students from 33 high schools around Utah, donating over $170,000. Talented, creative winners have obtained or are working to obtain various college degrees. Micro Focus and the State Board of Education plan to continue aiding local high school students in pursuing their passions during their college career.


Dr Peter Atkins
Academic Programs, Support and the TTP

DevOps – pressing ahead

In an IT world that seems to be accelerating all the time, the clamour for faster delivery practices continues. Derek Britton takes a quick look at recent press and industry reports.

Introduction

In many customer meetings I tend to notice the wry smiles when the discussion turns to the topic of IT delivery frequency. The truth is, I don’t recall any conversation where the client has been asked to deliver less to the business than last year. No-one told me, “we’re going fast, and it’s fast enough, thanks”.

The ever-changing needs of an increasingly-vocal user community guarantees that IT’s workload continues to be a challenge. And this prevails across new systems of engagement (mobile and web interfaces, new user devices etc.) as well as systems of record (the back-office, data management, number crunching business logic upon which those systems of engagement depend for their core information).

Moving at pace, however, needs to be carefully managed – less haste, more speed, in fact. Gartner says a quarter of the Global 2000 companies will be using DevOps this year. Let’s look to another deadline-driven entity, the press, for a current view.


Banking on DevOps

Speaking to an audience of over 400 at a DevOps conference in London, ING Bank global CIO Ron van Kemenade says investment in new skills and a transition to DevOps is critical as the bank adjusts to a mobile and online future through its “Think Forward” digital strategy.

“We wanted to establish a culture and environment where building, testing and releasing software can happen rapidly, frequently and more reliably. When beginning this journey we started with what matters most: people,” van Kemenade says.

Putting the focus on engineering talent and creating multi-disciplinary teams where software developers partner with operations and business staff has led to more automated processes, a sharp reduction of handovers and a “collaborative performance culture”, he adds.

Speaking at the same event, Jonathan Smart, head of development services at Barclays, talked up an eighteen-month push by the bank to incorporate agile processes across the enterprise.

Over the past year and a half, the amount of “strategic spend” going into agile practices and processes has risen from 4% to more than 50%, says Smart, and the company now has over 800 teams involved.

To accelerate its own transformation, BBVA has adopted a new corporate culture based on agile methodologies. “The Group needs a cultural change in order to accelerate the implementation of transformation projects. It means moving away from rigid organizational structures toward a more collaborative way of working”, explains Antonio Bravo, BBVA’s Head of Strategy & Planning. “The main goal is to increase the speed and quality of execution.”

Worth SHARing

Little wonder that the IBM mainframe community organization, SHARE, is continuing a significant focus on DevOps at the forthcoming August 2016 show in Atlanta. Tuesday’s keynote speech is called “z/OS and DevOps: Communication, Culture and Cloud”, given by members of the Walmart mainframe DevOps team.

Meanwhile, an article featured in Datamation, and tweeted by SHARE, provides further evidence and arguments in favour of adopting the practice. It cites the “2016 State of DevOps Report”, which says, “[Developers using DevOps] spend 22 percent less time on unplanned work and rework, and are able to spend 29 percent more time on new work”.


Time to Focus

Of course, Micro Focus are strangers neither to SHARE nor to DevOps. At a recent SHARE event, we attended the DevOps discussion panel, covering technical, operational and cultural aspects.

More recently, Micro Focus’s Solution Director Ed Airey penned an informative article published in SDTimes, outlining a smart approach to mainframe DevOps. The rationale, he says, is simple – competitive pressure to do more.

“Competitive differentiation depends on [organizations’] ability to get software capabilities to market quickly, get feedback, and do it again”

Addressing major challenges to make DevOps a reality, in both mainframe and distributed environments, Airey talks about how major question marks facing DevOps teams can be tackled with smart technology, and refined process; questions such as collaboration, development process, culture, skills, internal justification. He concludes with encouraging projected results, “Standardizing on common tooling also enables productivity improvements, sometimes as high as 40%.”

Of course – not everyone is convinced

Modern delivery practices aren’t for everyone. And indeed some issues sound quite daunting. Take Cloud deployment for example.

Sounds daunting? A recent Tech Crunch article certainly thought so.

We are treated to a variety of clichés about the topic such as “ancient realm” and “the archaic programs”. However, the publication failed to notice some important things about the topic.

Central to the piece is whether existing COBOL-based systems could be “moved” to another platform, the inference being that this would be an unprecedented, risky exercise. What is perhaps surprising, to the author at least, is that platform change is no stranger to COBOL. Micro Focus has supported over 500 platforms since its inception 40 years ago, and thanks to our investment the COBOL language is highly portable. Perhaps most importantly in this case, platforms such as the Cloud – or more specifically Red Hat (alongside SUSE, Oracle and many other brands of UNIX too) – are fully supported by the Micro Focus range. That is to say, there was never any issue moving COBOL to these new platforms: you just need to know who to ask.


Moving Ahead

Anyway, I can’t stop for long, we’re moving fast ourselves, continuing the DevOps discussion. Upcoming deadlines? Find us at SHARE in Atlanta in August, or visit us at a DevDay in the near future, or catch up with us on our website where we’ll be talking more about DevOps and smarter mainframe delivery soon.

What does chocolate have to do with security?

We all know that chocolate can ease the occasional slump at the office, and apart perhaps from its effect on the figure, a small piece of sweet indulgence hardly poses any serious danger, does it? But what would you say if an employee or a colleague gave away an important computer password for a piece of chocolate? You think that is impossible, because surely nobody can be that careless? Don’t be so sure …

One might think a little more of both could never hurt. Yet too much chocolate and too much security can each be counterproductive. That, however, is not the answer to the question, but rather the cue for a recent study. Researchers at the International School of Management in Stuttgart and the University of Luxembourg investigated how easily users give away their passwords. The study, conducted with 1,206 randomly selected people, shows that for a small favour, people will reveal their password to complete strangers. The scientists borrowed the methods of con artists: they sent out seven student assistants, who told randomly selected passers-by that they were conducting a survey on computer security. A bar of chocolate was offered as a reward. After a few questions on the topic, the testers asked participants to write their password on the questionnaire. The result is alarming: 36.8% of respondents disclosed their complete password, and a further 47.5% gave, when prompted, at least clear hints about parts of it. Chocolate as bait immediately before the password question amplified the effect: almost every second participant revealed their complete personal password if they had received a bar of chocolate just beforehand.

This kind of con is called “social engineering” or “social manipulation”. Just as hackers look for technical vulnerabilities to break into computer systems, con artists exploit human weaknesses: they “hack” their victims, so to speak, using targeted manipulation to obtain confidential information. The psychological levers include sympathy for seemingly similar people, deference to authority, greed and curiosity. While we can close technical vulnerabilities and gaps in our computer systems thanks to ever better technologies and new updates, human weaknesses remain largely unchanged.

Passwords alone cannot be relied upon – better solutions are needed for a secure future


For a long time, passwords represented the best compromise between security and usability. But with the growth of the mobile workforce, the increasing adoption of cloud services and cloud apps, and ever greater interconnection, with more and more sensitive corporate data being moved and edited jointly by employees and business partners, business data today is exposed to a greater threat from hackers, fraudsters and cyber theft. Only recently, Deutsche Telekom had to announce that 120,000 customer records had surfaced on the darknet and that these data were at least partly genuine and current. Telekom is by no means an isolated case: in early June this year it emerged that the credentials of 32 million Twitter users were up for sale. Before that it was LinkedIn and MySpace. And if even the Twitter and Pinterest accounts of Facebook CEO Mark Zuckerberg can be hacked, as Venture Beat reported in early June, then it is obvious that the long-standing doctrine of “protect and defend” through investment in sophisticated firewalls and antivirus software is no longer sufficient on its own to protect sensitive data.

Strong authentication methods such as multi-factor authentication, by contrast, can limit identity theft and thereby increase security. The idea is to combine three basic categories of authentication: something the user has, such as a token or a smartphone; something the user is, for example a fingerprint or another biometric trait; and thirdly something the user knows, such as a password, a PIN or a one-time password (OTP = One Time Password). The number of authentication methods is growing steadily, ranging from standards such as fingerprints, one-time passwords and smart cards to pattern recognition via camera or the use of a certificate on the phone’s SIM card.

MFA places new demands on developers

This raises the question of why more companies are not already using multi-factor authentication. One has to understand that a world without passwords has major implications for various areas of security, especially application security, where it requires a rethink on the part of developers. Most older applications use their own login dialog, which is relatively simple to build. MFA, by contrast, is more complex to implement: every application must be able to handle a wide variety of input methods, and the programming effort per application is considerable.

Access management solutions based on single sign-on (SSO) mechanisms and gateway components can solve this challenge and reduce the process to a single login. When protected or critical data and services are accessed, an additional one-time password, for example, can be requested by means of so-called “step-up” authentication. The decision to require this additional check can be made dynamically and adaptively, for instance when a user attempts to access sensitive data from a private device in an insecure country.
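The adaptive decision described above boils down to a risk-policy check evaluated at each access. The attribute names and rules below are invented for illustration only, not the behaviour of any specific access management product:

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    sensitivity: str       # "public", "internal" or "confidential" (illustrative levels)
    managed_device: bool   # corporate-managed device vs. private device
    trusted_network: bool  # known corporate network vs. unknown location

def requires_step_up(req: AccessRequest) -> bool:
    """Decide whether an additional factor (e.g. an OTP) should be requested."""
    if req.sensitivity == "confidential":
        return True                                    # critical data: always step up
    if req.sensitivity == "internal":
        # step up unless both device and network look low-risk
        return not (req.managed_device and req.trusted_network)
    return False                                       # public data: the SSO session suffices

# A private device in an unknown location asking for internal data triggers step-up:
print(requires_step_up(AccessRequest("internal", managed_device=False, trusted_network=False)))  # True
```

The point of the gateway approach is that this policy lives in one place rather than being re-implemented in every application, which is what keeps the per-application programming effort down.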

What to consider with any MFA method: security must also be practical

While increasing security should be the primary goal of any authentication solution, user-friendliness is no less important for successful adoption in the enterprise. If the processes are too complicated and inconvenient for users, both productivity and security suffer. It is important to strike an appropriate balance between the demands of operational agility and security. When planning a multi-factor authentication approach that suits them, companies should not only consider the current status quo of their requirements but also look at future needs. Aspects such as flexibility in the use and integration of different authentication methods, manageability in exceptional situations (for example, a forgotten smart card) and the TCO deserve particular attention. In our live webinar “Identity-based security for the hybrid IT of today and tomorrow” on 26 June, we provide answers on how IT security standards can be raised in a hybrid IT environment, how you can concretely protect yourself against misuse of digital identities, what companies should consider with regard to future-proof access management, and which solutions are already available today. Information and registration can be found here.

Götz Walecki

Manager Systems Engineering


Introducing Micro Focus Enterprise Sync: Delivering Faster Change

Delivering mainframe DevOps involves managing a lot more change, a lot more often. This may mean improving processes, but it also demands more of technology. Amie Johnson unveils how Micro Focus is supporting collaborative change.

Introduction

At Micro Focus, we believe mainframe organizations can achieve DevOps levels of efficiency simply by taking advantage of modern, efficient tools, adopting agile development practices, and fostering better team collaboration. It’s a matter of incrementally removing application delivery bottlenecks.

As such, Micro Focus just introduced a new product within our Enterprise Solution set aimed at helping mainframe developers deliver new releases, faster.

Enterprise Sync tackles head on one of the major delivery bottlenecks our customers encounter: coordinating and orchestrating rapid code change – needed in a DevOps model – using conventional mainframe configuration management tools.

The product supports rapid, contemporary parallel development to provide a means to adopt a more agile delivery method across mainframe development teams.

Why can’t we deliver multiple streams?

DevOps promises to eradicate delays in IT delivery. So, in the mainframe world, what’s the bottleneck?

One of the issues is all about how deliveries are managed. As robust as they are, trusted old mainframe configuration management tools weren’t designed to support parallel development, so multi-stream code merges are difficult, manual and prone to error. But, these mainframe configuration management tools hold unique configuration detail and metadata which are essential to supporting critical mainframe applications. So, while replacing such tools completely is out of the question, customers are looking for ways to support a more agile delivery model.

Removing Barriers

The Micro Focus solution, Enterprise Sync, helps solve the bottleneck associated with a desire to introduce parallel development activities. It does this by replicating mainframe source code to a distributed software configuration management platform. Code changes made via parallel development on the distributed platform are automatically synchronized with the mainframe SCM environment, such as CA Endevor. The integration and synchronization effectively introduces a new paradigm of speed and accuracy in delivering parallel development streams for mainframe delivery. This seamless integration with established software change management tools uniquely addresses the need to deliver faster change while preserving the organization’s valuable investment in mainframe processes and their software change and configuration management environment.


As part of the wider Micro Focus Enterprise product set, Enterprise Sync works collaboratively with our flagship mainframe application development tool, Enterprise Developer, to deliver:

  • Easier parallel development at scale across releases or teams
  • Greater efficiency through management and visualization of code change using modern tools
  • Alignment with current mainframe development process and source code
  • Improved developer productivity through continuous integration of key updates


Find out more

Establishing a modern mainframe delivery environment may be central to your DevOps strategy. Learn more about how Micro Focus can help with a complimentary Value Profile Service. See what’s possible and hear more about how Micro Focus has helped transform mainframe application delivery.

Achieve DevOps levels of efficiency, flexibility and collaboration. Learn more about the new Enterprise Sync release on the website, or download the product datasheet.


Software Testing: Myths vs Reality

If you’re thinking about pursuing a career as a Software Tester, this blog will make good reading! One of our junior testers, Karthik Venkatesh, puts pen to paper to help anyone starting out on a testing career with some expectation setting. Here’s what he’s learnt so far.

“Testing started when the human race began!”

The analytical side of the human mind is all about verification and validation before concluding anything, and software testing is no exception.

Market Outlook and Future for Software Testing

  • The Global Software Testing Services Market (2016-2020) research report predicts that the global software testing services market will grow at a CAGR of close to 11% during the forecast period.
  • According to a recent report by Fortune magazine, software testing is listed among the top 10 in-demand careers of 2015.


So aiming to pursue a career as a tester or in quality assurance looks like a good plan. Let’s take a look through some myths and realities of being a Software Test Professional:

Myths vs Reality about Software Testing

Myth-1: Testing is Boring

Reality: Testing is not boring or a repetitive task. It is like a detective’s job! Testing is a process of investigation, exploration, discovery, and learning. The key is to try new things. In reality, testing presents new and exciting challenges every day.

Myth-2: Testers do not write code

Reality: Some people may say that software test engineers do not write code. In fact, testers usually require an entirely different skill set, which could be a mix of Java, C, Ruby, and Python. And that is not all you need to be a successful tester: a tester also needs a good knowledge of the software manuals and automation tools. Depending on the complexity of a project, a software test engineer may write more complex code than the developer.
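To make the point concrete, much of the code a tester writes looks like the automated check below. The `apply_discount` function and its rules are invented purely for illustration; the interesting part is how the test probes the happy path, the boundaries, and an invalid input:

```python
def apply_discount(price: float, percent: float) -> float:
    """Return the price after a percentage discount; reject nonsense inputs."""
    if price < 0 or not 0 <= percent <= 100:
        raise ValueError("price must be >= 0 and percent in [0, 100]")
    return round(price * (1 - percent / 100), 2)

def test_apply_discount():
    assert apply_discount(100.0, 25) == 75.0    # happy path
    assert apply_discount(100.0, 0) == 100.0    # boundary: no discount
    assert apply_discount(100.0, 100) == 0.0    # boundary: full discount
    try:                                        # negative test: invalid input must fail loudly
        apply_discount(100.0, 150)
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for percent > 100")

test_apply_discount()
print("all checks passed")
```

Notice that only one of the four checks exercises the “normal” case; the tester’s craft is in choosing the boundaries and the inputs the developer did not think about.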

Myth-3: Testers job is only to find bugs

Reality: The job of a software test engineer is not restricted to finding bugs. A tester should be customer-focused, understand how the system works as a whole to accomplish customer goals, and have a good understanding of how the product will be used by the end-user. A tester has to understand the complete product architecture, how it interacts with the environment, how the application behaves in a given situation, and how the application integrates with all the components and works seamlessly.

Myth-4: Software testers are paid less than the developers

Reality: These days the quality of the product directly affects the product’s or the brand’s reputation, so no organization is prepared to compromise on quality. Organizations are always keen to work with energetic testers, and an efficient software tester can draw a higher salary than a developer of similar experience.


Top 7 tips for Software Test Engineers starting out on their career

  1. Development and testing are moving closer to the business units and you will need to communicate and work closely as a team.
  2. To find bugs, you will need to be creative. A software test engineer needs to come up with new ideas which would help in finding bugs. Work smart as well as hard! Always find better and simpler ways to do the assigned tasks, own tasks proactively and innovate.
  3. A good tester is the one who knows the application in and out. The tester should be aware of all the components in a product and the business logic behind it. Good knowledge of the product helps to understand the importance of a feature from a business perspective so become the expert!
  4. Always want to learn more!
  5. Hone complementary skills such as negotiation, thinking out of the box, and working across multiple platforms
  6. You will need to be persuasive and explain to the stakeholders which bugs have been found and how they are likely to impact end-users and the business.
  7. You must be a perfectionist and resilient under pressure, as testing is typically the last gate before the product reaches the hands of customers.

Corporations cannot hire customers, so they hire software test engineers who put products through their paces on potential customers’ behalf. So, to represent customers within a corporation – what kind of hat would you wear – a purple hat, a yellow, a blue, or a white?

Customers take different approaches to using a product. If you consider each approach as a colored hat, a test engineer needs to wear a wide variety of hats of different colors and shapes.

Testing is a career which is built with innovative thinking – be passionate about it and be strong enough to make your own choices work! Don’t forget to read the Micro Focus approach to Software Testing and view our impressive range of testing Products.


Karthik Venkatesh

Move beyond weak mainframe passwords with advanced multifactor authentication

Flexibility is the key when it comes to multifactor authentication, and you can use these same methods to authorize access to your host systems as well. You can set up different authentication requirements for different types of users and manage everything from a central console. David Fletcher provides more insight in his blog.

More and more companies are moving to multifactor authentication. Almost everyone agrees that multifactor authentication is the best way to provide the strongest level of authentication (who you are). This technology is taking hold in many industries, and for the most part it’s working pretty well. Now ask yourself “How can I use multifactor authentication to authorize access to my host systems?”


Complex and Expensive?

Wow—things just got really complicated and expensive. Think about who is accessing your host systems today. Employees all over the world with different devices and different access needs. Business partners who need access but don’t have your same systems and devices. What about customers who are actually updating their own data via web services on your host systems? The level of complexity that comes with implementing multifactor authentication for enterprise applications is hard enough. Now throw in the mainframe and it’s enough to keep anyone from moving in that direction.

But what if there was a flexible and manageable way to use multifactor authentication for host applications? Because Micro Focus is the expert in securing and managing access to your host systems, we have developed new capabilities to make implementing and managing multifactor authentication flexible and affordable. You can even use the same products for implementing multifactor authentication for your enterprise applications and authorizing access to your host systems.

Affordable and Flexible

The key to making multifactor authentication affordable and flexible is having a system that supports many different ways of authenticating. Such a system could support whatever methods of authentication are right for your users and your budget.

There are many different ways that a user can be authenticated. You can take advantage of the fact that most (if not all) employees or partners have a cell phone. No need for costly devices to increase the security of your systems. What if you could let a partner choose between answering three security questions, using a fingerprint, or a combination of questions and cell phone?

Flexibility is the key when it comes to multifactor authentication. Now you can also use these same methods to authorize access to your host systems as well. You can set up different authentication requirements for different types of users and manage everything from a central console.

Micro Focus® Advanced Authentication, combined with Host Access Management and Security Server (MSS) and one or more of our terminal emulation clients, provides up to 14 different methods of authentication to authorize access to host systems. As new technologies emerge, you can count on Micro Focus to stay ahead of the game so that when you are ready to make a move, we are too.

To learn more about enabling multifactor authentication to authorize access to your host systems, contact your Micro Focus sales representative today.

Originally published here

Is The Mainframe a Hacker’s Target?

Business-critical mainframe systems are accessed daily by millions of users. Industry expert Ron LaPedis takes a hard look at the security risks, and explores how to plug the major gaps

A variety of Terminal Emulation solutions enable millions of users to access their mainframe computer systems. The choice of terminal emulation solutions ranges from thin hardware clients, to thick software clients, to thin software clients running in a browser. Most of these clients interpret the data streams being passed back and forth from the host using protocols such as 3270, 5250, VT, X-windows, T27, UTS, or 6530, and reformat them for display on more modern devices such as PCs and tablets.

These more modern devices are all connected to the mainframe using standard Internet Protocols – which means that the data can be sniffed or even modified. And not only that, depending on how old the mainframe code is, Personally Identifiable Information (PII) might be displayed. In some cases, this is in violation of HIPAA, PCI DSS, EU Data Protection laws, or other rules and regulations that didn’t exist when that code was written.

Mainframe Security

As a result of serious vulnerabilities within SSL and early TLS, organizations can be put at risk of data breach. In fact, the Payment Card Industry Security Standards Council (PCI SSC) mandated that data communications are to be protected by TLS 1.1 or later (as of June 30, 2016). Even though NIST deprecated (killed off) SSL as of 2014, the 2016 deadline was moved to 2018 to give member organizations extra time; which of course gives hackers extra time too. The existence of the POODLE and Heartbleed exploits, among others, prove that anyone using SSL and early TLS risks being breached.
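As a sanity check on the client side, a TLS floor can be enforced in a few lines. The sketch below uses Python's standard ssl module (Python 3.7+) to build a context that refuses SSL and early-TLS handshakes, matching the PCI SSC mandate described above; it is an illustration, not a description of any Micro Focus product's configuration.

```python
import ssl

# Build a client context with sensible defaults (certificate
# verification and hostname checking enabled).
ctx = ssl.create_default_context()

# Refuse anything below TLS 1.2: SSLv3, TLS 1.0, and TLS 1.1
# handshakes will now fail, closing off POODLE-style downgrades.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

print(ctx.minimum_version)  # the enforced protocol floor
```

Servers that only speak SSL or early TLS will simply fail the handshake against such a context, which is exactly the behavior the PCI SSC deadline is meant to force.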


Breaking In

Can we talk about passwords for a moment? Most applications were written in simpler times when 8-character passwords were the norm. And Multi-factor authentication? Forget it!

The chances are that critical mainframe applications (and administrator accounts) are not only limited to 8-character passwords, but 8-character passwords which contain only letters and numbers – taking less than six hours to crack.
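The six-hour figure can be sanity-checked with back-of-envelope arithmetic. The guess rate below is an assumption (an offline GPU rig testing ten billion candidates per second against a fast hash), chosen only to illustrate the order of magnitude.

```python
# Keyspace for an 8-character password limited to letters and digits:
# 26 lowercase + 26 uppercase + 10 digits = 62 symbols per position.
symbols = 26 + 26 + 10
length = 8
keyspace = symbols ** length  # total candidate passwords (~2.18e14)

# Assumed attacker speed: 10 billion guesses/second (hypothetical
# but plausible for GPU hardware attacking a fast, unsalted hash).
guesses_per_second = 10_000_000_000
hours_to_exhaust = keyspace / guesses_per_second / 3600

print(f"{keyspace:.3e} candidates, ~{hours_to_exhaust:.1f} h to exhaust")
```

Exhausting the whole space takes roughly six hours under these assumptions, and on average a password falls in half that time, which is consistent with the claim above.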

And then there are question marks around the use of Java due to its vulnerability-of-the-week history. Many browser-based Terminal Emulation software clients require a specific version of Java running on a specific browser version – which may have its own vulnerabilities. It’s not unreasonable to say that Java is somewhat notorious as a security trap door.

Defending the Estate

Mainframe security matters. Today’s terminal emulation software packages need to be secure, manageable, and easy to use. It doesn’t matter whether users are on thin clients, PCs, Macs, or mobile devices. And a large number of terminal emulation protocols, along with specialized host software (such as airline reservation systems), must be supported.

Whether internal policy requires a management server on a Linux partition within the mainframe or on an external Host Access Management and Security Server (or MSS), modern mainframe security solutions need to:

  • Centrally manage terminal emulation access to your host systems by using your existing Identity and Access Management systems
  • Easily update terminal emulation user configurations to meet evolving security requirements
  • Quickly validate compliance of terminal emulation for securing sensitive information
  • Ensure that end users cannot make changes to their user configuration
  • Partially or fully mask data fields based on the user’s role
  • Enforce data input standards and cross-screen validation
  • Implement long complex passwords and multi-factor authentication.

A Fresh View On Mainframe Terminal Emulation

Such lofty objectives are, however, not the stuff of dreams. All of this is possible today. Reflecting our relentless focus on customer success, Micro Focus has invested to create a new generation of powerful, secure and comprehensive emulation products.

Beyond tackling the requirements above, additional capabilities include end-to-end encryption of data streams, centralized management, partial or full masking of sensitive data fields, multi-factor authentication, integration with Microsoft Office tools, and linkage to other Micro Focus identity management software for user lifecycle management.

Without touching a line of code on the host, you can lock down access to your mainframe, meet industry-specific rules and regulations, and prevent data from being taken out of the organization through traffic monitoring, or from being modified in ways that would impact the business.

Additionally, power users can now create entirely new ways of viewing and manipulating core business data, again without modifying a line of mainframe code. Creating powerful and user-friendly Windows- or web-based applications from dated green-screen applications is just a few clicks away.

The mainframe is a powerful part of organizational value. It must be web and mobile device ready, but also totally secure. Whether organizational security direction is coming from the board, auditors, business units, end-users, or, more importantly, the customers, Micro Focus provides powerful solutions that can help address these requirements by making access to core mainframe applications secure and friendly.


Ron LaPedis

Global Sales Enablement Specialist