DevPartner: how to avoid slow code under the .NET framework – part 3

This blog series continues with tips for managing database calls to improve programming performance under the .NET framework.

Slow code – what are the causes and what can you do about them? Part 3

Previously we discussed the importance of selecting your programming constructs wisely, and managing your memory as efficiently as possible to reduce the threat of slow code. Applying some of these tips for database management will also improve application performance during production.

Database Calls

Database calls tend to be expensive because they frequently go out to disk, and queries can take a long time to execute. An application that makes frequent database calls will be slow because it repeatedly pauses to write data to disk and read it back.

Good programming practice involves creating an in-memory database object using ADO.NET, LINQ, or a similar technique. This brings the data needed by a particular routine into application memory. Even this approach isn't guaranteed to deliver the needed performance, however, particularly if database objects haven't been constructed appropriately. In such cases, additional data may have to be retrieved, or the application may retain data in the database object long after it should have been returned to the database and its memory reclaimed by the system.
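As a minimal sketch of this approach (the table, columns, and values here are hypothetical), a DataTable filled once can serve repeated lookups from memory instead of issuing a query for each one. In a real application the table would be populated once via a SqlDataAdapter or a LINQ query rather than by hand:

```csharp
using System;
using System.Data;

class CustomerCache
{
    // Hypothetical in-memory copy of a Customers table. In a real
    // application this DataTable would be filled once from the database,
    // then queried in memory instead of going back to disk.
    private readonly DataTable _customers = new DataTable("Customers");

    public CustomerCache()
    {
        _customers.Columns.Add("Id", typeof(int));
        _customers.Columns.Add("Region", typeof(string));
        _customers.Rows.Add(1, "East");
        _customers.Rows.Add(2, "West");
        _customers.Rows.Add(3, "East");
    }

    // Repeated lookups are served from memory; no round trip to the database.
    public int CountInRegion(string region)
    {
        return _customers.Select($"Region = '{region}'").Length;
    }
}
```

The same pattern applies whether the cache is a DataTable, a list of entities from LINQ, or any other in-memory structure; the point is to pay the disk cost once.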

Databases themselves also offer caching, and it’s incumbent on the development team to work with the DBA to ensure caching is optimized for the application. While this may involve a measure of trial and error, database tuning at this point in the application lifecycle will ensure better performance during production.

Next, we’ll look at how to identify slow code in your application.

Blog article submitted by Peter Varhol, Technology Strategy Research

What’s new in Modernization Workbench Version 3.2

Modernization Workbench delivers true insight into enterprise applications. Executives gain top-level insight into the value, cost, and risk of applications, to identify and drive modernization and development priorities. Rich technical insight means that development teams can adapt and modernize their applications to respond to strategic goals.

The new v3.2 release of the Modernization Workbench family of products delivers enhanced value across five key areas:

More Powerful Analysis:  Modernization Workbench has always set the standard for depth of analysis. New capabilities now let advanced users extend language and platform support; for example, users can capture additional insight into the structure of XML-based application frameworks, such as a proprietary Java framework.

Broader Coverage:  Users with C# .NET applications can take advantage of Modernization Workbench to generate Application Portfolio Management metrics and perform high-level impact analysis. The result is better decision-making across even highly heterogeneous portfolios.

Analysis at Your Fingertips:  A new ‘assistant’ feature dramatically accelerates analysis. Users can now pivot instantly from an artifact of interest to related, in-context analytics, delivering faster outcomes for any application assessment.

Enhanced Analysis in Visual Studio:  New analytical capabilities have been added to the Visual Studio 2010 based Analyzer Express module.  Users can access Control Flow and Execution Path analysis within the development environment they use every day.  This delivers greater insight into how a COBOL application is structured, helping to speed development.  Convenient query libraries and visualization tools help identify issues in code that could impair quality and maintainability.

Better Business Decisions:  New enhancements to Enterprise View simplify the organization of metrics by business process, geography, and development team.  Now managers gain valuable insight to determine where to focus resources to ensure that their applications are modernized.

DevPartner: how to avoid slow code under the .NET framework – part 2

Continue reading this blog series for tips on avoiding slow code when programming for performance under the .NET framework.

Previously we looked at how selecting programming constructs wisely can help you avoid slow code. The importance of memory management to keep your code up to speed takes center stage in this article.

Application Memory

Poor memory usage can also slow down application execution. Even in managed environments such as .NET, where the system allocates and reclaims all memory, attention to memory use is important. Improper memory management can lead to large numbers of long-lived objects, huge objects that serve no programmatic purpose, and object leaks. All of these contribute to memory bloat and slow execution, and if not resolved they can ultimately lead to out-of-memory errors and application crashes.
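One common source of long-lived objects is a static collection that accumulates references and never releases them. The sketch below (the class and member names are hypothetical) shows the pattern, along with the simple remedy of explicitly releasing the references so the collector can reclaim the memory:

```csharp
using System;
using System.Collections.Generic;

static class AuditLog
{
    // A static collection keeps every entry reachable for the lifetime of
    // the process, so the garbage collector can never reclaim them. This is
    // a classic source of long-lived objects and memory bloat.
    private static readonly List<byte[]> _entries = new List<byte[]>();

    public static void Record(byte[] entry) => _entries.Add(entry);

    public static int Count => _entries.Count;

    // Clearing (or using a bounded structure) drops the references and
    // lets the collector reclaim the memory.
    public static void Clear() => _entries.Clear();
}
```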

Within managed environments, many programmers still prefer managing their own memory, and the .NET Framework provides a limited ability to do so. However, the garbage collector in the .NET Framework will almost always do a better job of managing memory than the programmer.

There are techniques available to the programmer to make sure that the memory footprint doesn’t grow unnecessarily. One important goal is to give as many objects as possible the shortest practical lifetime. This is done by declaring variables locally wherever possible, so that they go out of scope when their method completes. Once out of scope, their memory can be quickly reclaimed by the .NET garbage collector.
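A minimal illustration of this scoping rule (the class and method names are hypothetical): by keeping the working object local to the method, it becomes unreachable as soon as the method returns and can be reclaimed cheaply in a young-generation collection rather than lingering as a long-lived field:

```csharp
using System;
using System.Text;

class ReportBuilder
{
    // The StringBuilder is local: it goes out of scope as soon as the
    // method returns, so the garbage collector can reclaim it quickly
    // instead of keeping it alive as a long-lived instance field.
    public static string BuildLine(string name, int value)
    {
        var line = new StringBuilder();
        line.Append(name).Append(": ").Append(value);
        return line.ToString();
    }
}
```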

What are your experiences with managing application memory in your development? Share them with us below.

Blog article submitted by Peter Varhol, Technology Strategy Research

This blog series continues in part three with tips for managing database calls to improve programming performance under the .NET framework.

Announcing extend® 9

Important news for extend and RM/COBOL users

We are delighted to announce that extend 9, the latest version of extend, will be available from December 17th. With this release, extend has been improved to meet the requirements of users of earlier versions as well as RM/COBOL users looking to move to extend.

extend 9 has enhanced .NET support, improvements to Windows native look and feel, simplified naming for DLLs and shared libraries, plus a new module, Xcentrisity BIS, which simplifies the creation of Web Application Services.

  • Xcentrisity Business Information Server (BIS) is a native COBOL environment for building and deploying Web Services.
  • The BIS Server is a web server environment that manages application sessions and makes them available via any web browser or other web user agent that is granted access to the BIS server.
  • Application developers gain the capability to move applications forward by building Service Oriented Architecture (SOA) applications incorporating legacy business data and logic freely mixed with the latest web languages and tools.
  • Application developers also gain a unique opportunity to build state-of-the-art browser-based web applications or SOAP-based Web Services comprising ACUCOBOL-GT® programs and COBOL data files and databases.

Many RM/COBOL users already make great use of Xcentrisity BIS and its benefits are now available to users of the extend family of products. Having BIS as part of the extend product set will be of benefit to those RM/COBOL users looking for a simple step to a more modern COBOL environment.

More details are available, including a document on the value of upgrading to extend 9 and beyond.

We would like to thank all our beta testers who helped us make extend 9 the high-quality product that it is, and we look forward to working with you all as you take your COBOL applications forward.

DevPartner – how to avoid slow code under the .NET framework

Read these tips on avoiding slow code when programming for performance under the .NET framework in this new blog series.

Slow code – what are the causes and what can you do about them?

Welcome to this short series of blog articles in which we provide a practical guide to identifying and addressing some of the core issues around slow code. We kick the series off by looking at the importance of selecting Programming Constructs wisely. Future blogs will consider factors like Application Memory and Database Calls. Please feel free to comment and share your experiences with us.

The importance of selecting Programming Constructs wisely

Programming for performance in managed code under the .NET Framework is geared towards optimizing those operations that carry a heavy computing cost, such as database calls, memory allocations (especially large ones), floating point operations and certain Framework calls. 

.NET performance analysis helps determine which calls are more expensive than others.

Most or all of these expensive operations will have to be executed at some point; the trick is to execute them as seldom as possible, and in more efficient ways. This is where combining performance analysis with code reviews pays off: armed with performance information, code reviews can focus on refactoring to achieve better code execution times.

A development team may well focus on the time taken to execute code, but equal attention needs to be paid to the frequency of executing a particular piece of code. A single call might be relatively inexpensive but placing that call in a loop could involve executing an expensive operation thousands of times.
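To illustrate the loop point (with a deliberately simplified stand-in for the expensive call), hoisting a loop-invariant operation out of the loop means it executes once instead of once per iteration, while the result stays the same:

```csharp
using System;

class LoopHoisting
{
    // Stand-in for an expensive operation (e.g. a lookup or conversion)
    // whose result does not depend on the loop variable.
    static int ExpensiveLookup() => 7;

    // Slow version: the expensive call runs on every iteration.
    public static int SumSlow(int[] items)
    {
        int total = 0;
        for (int i = 0; i < items.Length; i++)
            total += items[i] * ExpensiveLookup();
        return total;
    }

    // Fast version: the call is hoisted out of the loop and runs once.
    public static int SumFast(int[] items)
    {
        int total = 0;
        int factor = ExpensiveLookup();
        for (int i = 0; i < items.Length; i++)
            total += items[i] * factor;
        return total;
    }
}
```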

That, in fact, represents a significant difference between managed and native code.  Memory allocations are inexpensive in native code, and releases are comparatively expensive.  In managed code, the opposite is true – allocations are expensive, and releases are inexpensive.

The .NET Framework provides alternative means of performing many actions.  When there are several choices available, developers should choose the calls that perform the actions they require, without a lot of additional computation.  Developers should analyze the performance of alternative Framework calls to identify the best calls for their purposes.
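For example, repeated string concatenation and StringBuilder are two Framework techniques for the same action with very different allocation behavior; comparing alternatives like these under a profiler is how a team identifies the better call for its purposes:

```csharp
using System;
using System.Text;

class Concatenation
{
    // Repeated concatenation allocates a new string on every pass, since
    // .NET strings are immutable.
    public static string JoinWithPlus(string[] parts)
    {
        string result = "";
        foreach (var part in parts)
            result += part;   // each += allocates a fresh string
        return result;
    }

    // StringBuilder performs the same action with far fewer allocations
    // by appending into a growable internal buffer.
    public static string JoinWithBuilder(string[] parts)
    {
        var sb = new StringBuilder();
        foreach (var part in parts)
            sb.Append(part);
        return sb.ToString();
    }
}
```

Both methods produce identical output; only their memory behavior differs, which is exactly the kind of difference performance analysis makes visible.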

Full performance analysis, such as that available within DevPartner, traces the time it takes to execute every line of code and every operation, as well as how many times each line is executed. This provides the insight to understand fully how long code takes to execute, down to each individual line.

In the next article we examine how to identify poor memory usage within applications.

Application Modernization: Fundamentals for Success in the Post-Crisis Financial Services Industry – White Paper

This paper discusses three major drivers of change which modernization can help to address.

This paper discusses three major drivers of change which modernization can help to address: (1) the need for increased business and IT agility, (2) the mandate to reduce the cost of IT across the institution, and (3) the requirement to comply with new and increasing regulations expected to emerge over the next several years. Through systems and applications modernization, IT departments can deliver:

  • Important cost reductions on day-to-day operations
  • Shortened time to market for new products and services
  • A self-funded path to new technical capabilities and business agility
  • Recognition of IT’s strategic value across the financial institution

Click here to download the full report.