Requirements Analysis through Function Point Automation

In 1987, American computer architect, software engineer, and computer scientist Fred Brooks suggested that “the hardest single part of building a software system is deciding precisely what to build. No other part of the conceptual work is as difficult as establishing the detailed technical requirements . . . No other part of the work so cripples the resulting system if done wrong. No other part is as difficult to rectify later.”[1] Nearly 30 years later, Brooks’ assessment remains true. Poor requirements have long been a problem across both industry and the federal government. Business process and requirements analysts face the challenge of effectively parsing hundreds and sometimes thousands of requirements to define system design needs, determine the expected software size, and accurately estimate the software development cost. This detail-oriented and often grueling process demands highly experienced analysts with a tremendous tolerance for repetitive requirements parsing, and even with the most meticulous analysis, mistakes remain inevitable.

Innovation in systems and software engineering has virtually exploded in the 30 years since Brooks made his observation. Machines—once solely reliant upon human programming to operate—are now capable of “learning” processes and automating previously human-dependent functions. Such innovations in automation and machine learning can aid analysts in understanding and parsing requirements, identifying duplication in both language and meaning, and dramatically reducing the time and effort necessary to accurately analyze projects. Breakthroughs in machine learning and natural language processing (NLP) now make it possible to size requirements through automated function point counting and to perform system analysis by deriving entity relationships directly from the original requirements.
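The internals of any production sizing tool are proprietary, but the basic idea can be sketched. The toy Python below (all verb-to-type mappings and sample requirements are illustrative assumptions) classifies each requirement's main verb as an IFPUG transaction type (external input, EI; external output, EO; or external inquiry, EQ) and sums the standard average-complexity weights:

```python
# Illustrative sketch only: map a requirement's verb to an IFPUG
# transaction type, then sum average-complexity weights.
AVG_WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4}  # IFPUG average weights
VERB_TYPES = {  # assumed verb-to-type mapping for this sketch
    "enter": "EI", "create": "EI", "update": "EI", "delete": "EI",
    "report": "EO", "export": "EO", "calculate": "EO",
    "view": "EQ", "search": "EQ", "display": "EQ",
}

def classify(requirement: str):
    """Return the transaction type implied by the first known verb."""
    for word in requirement.lower().split():
        t = VERB_TYPES.get(word.strip(".,"))
        if t:
            return t
    return None

def unadjusted_fp(requirements) -> int:
    """Sum average weights over all classifiable requirements."""
    return sum(AVG_WEIGHTS.get(classify(r), 0) for r in requirements)

reqs = [
    "The user shall enter a new customer record.",
    "The system shall display search results.",
    "The system shall export a monthly report.",
]
print(unadjusted_fp(reqs))  # 4 + 4 + 5 = 13
```

A production counter would also identify internal and external logical files and rate each function's complexity; this sketch only shows why parsed requirements map naturally onto a function point count.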

Today’s analysts from both government and industry can leverage these developments in machine learning, natural language processing, and automation—innovations that undoubtedly would have dazzled computer scientists in 1987—to mitigate the challenges presented by ambiguous requirements, requirements creep, defective requirements, and “gold plating.” These technologies make it possible to verify effective requirements and easily identify defects analysts might otherwise miss. What’s more, such capabilities produce more accurate cost estimates and allow the analyst—who so often is not a subject matter expert—to achieve a deeper understanding of the requirements. Perhaps most importantly, these innovations automate and streamline processes, reducing the risk of errors and producing results within seconds, not multiple man-hours. The result is a system that leverages innovation in technology, augments our human capabilities, saves time and money, and produces more accurate analysis than ever before. Although the challenges facing government and industry requirements activities remain much the same as they were in 1987, the solutions available to us are infinitely more advanced. By effectively leveraging today’s technology, we ensure these challenges truly become a thing of the past.

At Logapps, we pride ourselves on our ceaseless pursuit of innovation: we are committed to introducing new technologies that help our customers work smarter, eliminate time-consuming processes, and incorporate tools that ease the transition to more ad hoc solutions. Logapps’ newest innovation is the Function Point Automation (FPA) Tool—a product that began as a research and development (R&D) endeavor to explore breakthroughs in machine learning and NLP. In our work with diverse federal government agencies—including the Internal Revenue Service (IRS), Navy, and U.S. Courts—we identified challenges common across the government. Analysts struggle to manually sort through scores of data to identify duplicates and isolate poorly written, incomplete, ambiguous, or missing requirements. Accurate software size estimates are difficult to determine, and the traditional manual function point counting method for cost estimation requires expert, trained personnel and an inordinate amount of time and resources to complete. Even then, the analysis is prone to error, resulting in inconsistencies that trickle down into product development and hinder the performance of even the most meticulously planned project. The onerous task of requirements analysis and software cost estimation has historically put enormous strain on agencies’ budgets and schedules. Over the course of two years, Logapps analysts applied their research in NLP and machine learning to develop a tool capable of automating these complex processes and providing a nearly instantaneous cost estimate for a proposed requirement.

For the past nine months, Logapps has piloted the FPA Tool in-house with great success. In one trial, Logapps analysts compared the tool’s performance against the traditional human-based requirements analysis process. The FPA Tool analyzed 840 requirements in 1,147 seconds, compared to the 100 man-hours it took for analysts to perform the same analysis. Across the board, the FPA Tool identified four times as many duplicates, similarities, and poorly written or incomplete requirements as the human analysts (80 vs. 20). Using NLP to parse requirements, the FPA Tool produces consistent and immediate estimates, decreasing the time experts spend analyzing requirements and making it possible to perform quality control and manage software budget sizes for a fraction of the cost and time required under the human-based method. The benefits to the client are multifaceted: for cost estimation analysts, the FPA Tool performs function point counting analysis to estimate the cost of a set of requirements; for requirements writers, the tool provides immediate feedback on the clarity of the requirement statement and the approximate cost of a requirement; and for software developers, the tool identifies and removes duplicate requirements and summarizes the main abilities and functionalities of the software.
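To illustrate the general idea behind NLP-based duplicate detection (this is a generic sketch, not the FPA Tool's actual algorithm; the 0.8 threshold and sample requirements are assumptions), near-duplicate requirements can be flagged by comparing bag-of-words cosine similarity:

```python
# Generic sketch of near-duplicate detection over requirements text.
import math
from collections import Counter

def vectorize(req: str) -> Counter:
    """Bag-of-words vector: lowercase tokens, punctuation stripped."""
    return Counter(w.strip(".,") for w in req.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def flag_duplicates(reqs, threshold=0.8):
    """Return index pairs whose similarity meets the threshold."""
    vecs = [vectorize(r) for r in reqs]
    return [(i, j)
            for i in range(len(reqs))
            for j in range(i + 1, len(reqs))
            if cosine(vecs[i], vecs[j]) >= threshold]

reqs = [
    "The system shall log all user access attempts.",
    "The system shall log all user access attempts to the database.",
    "The system shall encrypt data at rest.",
]
print(flag_duplicates(reqs))  # [(0, 1)]
```

A real tool would go further, using semantic models to catch duplicates that share meaning but not wording, but even this simple lexical measure shows how pairwise comparison replaces hours of manual side-by-side reading.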

Buoyed by the success of the pilot program, Logapps is excited to share the FPA Tool with our clients, transitioning the tool from an in-house R&D project to a product available for use across our federal client portfolio. We view this tool as an extension of our service capability: a vehicle to save time and money, increase customer satisfaction, and leverage our passion for innovation to relieve the burden of complex manual processes—ultimately enabling our clients to devote their resources to managing and delivering exceptional products, not wasting valuable time and effort on tedious data analysis activities.

Are you ready to test-drive the tool in your organization? Contact us to schedule a demo and learn about how the FPA Tool will transform the way you do business.

[1] Brooks, F. "No Silver Bullet: Essence and Accidents of Software Engineering." Computer 20, 4 (April 1987): 10–19.

ISO 9001:2008 Certification

Logapps LLC, a leader in cost estimation, business case analysis, analysis of alternatives, and program management support for federal IT systems, has been awarded ISO 9001:2008 certification for its Quality Management System, demonstrating its uncompromising commitment to providing the highest quality analysis and services to its customers.

“Obtaining this certification validates all of our efforts to improve the quality, delivery, and dependability of our analysis and services,” said Ed Spriggs, Managing Partner of Logapps LLC. “This is a continuation of our efforts to establish ourselves as a reputable consulting firm and is a significant milestone in our continuous improvement efforts,” said Kevin McKeel, Managing Partner.

Logapps LLC received its ISO 9001:2008 registration from IMSM, an accredited registrar that performs assessments of management systems against the requirements of national and international quality standards. Logapps LLC’s Quality Management System and ISO 9001:2008 certification are applicable to all facets of its operations.

What is the ISO 9001:2008 Certification?

ISO 9001 is a quality management standard published by the International Organization for Standardization (ISO), designed to help organizations ensure that they meet the needs of customers and stakeholders while complying with statutory and regulatory requirements related to a product. Headquartered in Geneva, Switzerland, ISO was established in 1947 and has issued over 1.1 million certifications to organizations in 187 countries. Certification to the standard has classically been used in global supply chains to provide assurance about suppliers’ ability to satisfy quality requirements and to enhance customer satisfaction in supplier-customer relationships. Today, certification has become increasingly popular among IT firms and federal government agencies.

Ways ISO certification can help you in your organization:

One of the top reasons Logapps LLC obtained the ISO 9001:2008 certification is to demonstrate to our clients that we practice what we advise. Logapps LLC is recognized for the quality of our work through our repeat clientele, and we want to strengthen that recognition through a well-established institution. By maintaining transparency in our commitment to continual improvement, we aim to keep adding value and to remain a leading IT consulting firm for data and analysis.

Cost Estimation 101

Through industry-leading tools and innovative strategies, Logapps provides Life Cycle Cost Estimates, Analysis of Alternatives, Cost Benefit Analysis, and Business Case Analysis to decision makers at all levels of the federal government.

A quick overview of the Cost Estimation process can provide a fundamental background on how Logapps can fulfill your needs.

The "New CIO"

Over the last few decades, technology has become increasingly prominent within the business world, transforming the way businesses think, operate, and manage beyond recognition. Advances in information technology have revolutionized industries, altering not only the culture but also the pace at which business is conducted. As innovation progresses, business leaders must find a way to adapt to the changing workplace environment in order to keep up with customer needs. Already, Chief Information Officers (CIOs) are learning to be more strategic, recognizing the need to better integrate technology across business units within their organizations. Because of the increasing influence of technology, CIOs are now being acknowledged as key and equal members of the executive team.

OPERATIONAL CIO VS. STRATEGIC CIO

An operational CIO is more or less a glorified, tech-centric Program Manager who focuses primarily on managing complex IT projects.  Although functional, these types of CIOs are not “game-changers” because they do not place an emphasis on the people, processes, and tools that can be leveraged to promote innovation throughout the organization.  Without an equipped advocate for innovation within the organization, companies cannot capitalize on their growth potential.  According to recent studies of federal IT portfolios, approximately 80% of the operational CIO’s budget goes towards maintaining current IT systems while the remaining 20% is dedicated towards developing and implementing new technological ideas. Unfortunately, when operational CIOs do uncover new technologies, many struggle to promote them due to their inability to converse in the business realm.

Strategic CIOs, on the other hand, speak the language of business and IT fluently. They are business executives first, and technology experts second, able to translate technical information into actionable insight that drives their business forward. Furthermore, Strategic CIOs focus on the customer’s needs and determine how current and new technologies can best address those needs. In designing and managing IT portfolios, they focus primarily on new efforts in Big Data, mobile, cloud, cyber, and social media, making sure they understand how to leverage these disciplines effectively in order to take advantage of their full potential. Since Strategic CIOs are adept in both the business and technical realms, they are capable of increasing business value, which benefits both the business and the customer.

HOW STRATEGIC CIOS PROSPER

Strategic CIOs are multi-talented, capable of adapting to the needs of the business. They have a wide variety of knowledge and skills and are always prepared to perform, whether as analysts, negotiators, or leaders. Thus, Strategic CIOs are especially valued for their hybrid talent in both the business and technology worlds. They are not simply businessmen and neither should they be categorized as technical staff. Instead, they serve as a hybrid of the two, capable of understanding technical information as well as business drivers and goals. With this unique ability to communicate with all levels of the business, Strategic CIOs possess the capability to effectively drive technology and innovation to increase business value.

To become a strategic CIO, operational CIOs must prove themselves through value-added initiatives. Strategic CIOs are leaders in technology and advocates for their company’s technological investments. They drive progress and achieve success by promoting technology as a valuable asset to their company: a strategic CIO understands how new technology will fit into all parts of the business and develops an implementation plan to transition the business towards technological achievements. In addition, the ability to understand the impact of technology on business growth is a critical skill employed by strategic CIOs. In order for CIOs to determine how technology can benefit the company, they must be able to predict its return on investment (ROI). According to The Economist, new technologies are often abandoned because 37% of CIOs do not have the ability to predict their ROI. A CIO who cannot do this will not flourish in the field.

It’s strategically necessary that CIOs stay knowledgeable of the latest technology, finding ways to integrate it into the company to improve productivity and efficiency. Additionally, since the CEO is a CIO’s biggest supporter, a strategic CIO focuses on collaborating with the CEO to determine how technology can positively affect the company. Strategic CIOs are able to deal with change, possess good communication skills to advocate new ideas, and most importantly, understand the business so that technology can be used to fulfill company objectives. Therefore, with a strong business sense and an emphasis on technological innovation, strategic CIOs are revolutionizing the business world.

Independent Verification and Validation (IV&V) Through the Eyes of DoD

IT departments at different agencies and organizations across the government each have their own ways of conducting testing and evaluation activities. In the eyes of the U.S. Department of Defense, Independent Verification & Validation (IV&V) is an independent system assessment that analyzes and tests the target system to 1) ensure that it performs its intended functions correctly, 2) ensure that it performs no unintended functions, and 3) measure its quality and reliability.

What is IV&V?

In the federal IT world, it is often asked, “What is the difference between verification and validation?” Simply put, verification ensures the software product is built correctly, while validation ensures the right software product is built. The intent of verification and validation is to improve the quality of the software during its life cycle, not afterwards, so it must be performed as the software is being developed. Federal organizations requiring a very high level of accuracy in the estimation, design, construction, execution, and management of their IT programs have long used some form of independent verification and validation to assure software quality. This process is sometimes used internally as a “sanity check.”

Independence

IV&V teams are independent of the development organization on a technical, managerial, financial, and contractual basis, but have well-established, working relationships with the development organization.  Early this year, the U.S. Department of Education published an IV&V handbook that stated:

  • Technical independence requires that IV&V personnel not be involved in any stage of the software requirements, design, or development process.
  • Managerial independence requires that IV&V responsibility be vested in an organization that is separate from the development and program management organizations. The independent selection of the artifacts to be examined and tested, the techniques to be used, the issues to be chosen, and the reporting to be made further affirm this independence.
  • Financial independence requires that the IV&V budget be vested in an organization independent from the development organization.
  • Contractual independence requires that the IV&V contract be executed separately from the contract for development.

The IV&V team will generate the test plans, test designs, test cases, and test procedures in preparation for IV&V testing.  This independent testing will complement rather than duplicate the development team’s testing.

Types of Testing

As a former Naval Sea Systems Command (NAVSEA) test engineer, I had the pleasure of working alongside IV&V teams (typically one or two full-time equivalents, or FTEs) that conducted three primary test events to ensure the software product was ready to move forward in the software acquisition life cycle. Team makeup differs depending on the software being developed, resource capacity, and organizational experience, but throughout the DoD it is customary to conduct three unique tests before the software goes into production:

  1. Regression Testing
  2. Functional Testing
  3. Non-Functional Testing

Depending on the organization’s capacity, experience, and the system being developed, some IV&V teams also conduct interface testing, but this is usually done during system-level integration.

REGRESSION TESTING

Regression testing is any type of testing that seeks to uncover new bugs or defects in existing functional and non-functional areas of a system. It is typically conducted after changes such as enhancements, patches, or configuration updates that may have altered the behavior of the system.
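A minimal sketch of the idea, with a hypothetical discount function standing in for existing system behavior: the test pins known-good outputs so that a later enhancement or patch that alters them fails immediately.

```python
# Hypothetical example: apply_discount represents existing, already
# verified behavior; the regression test pins its known-good outputs.
def apply_discount(price: float, percent: float) -> float:
    return round(price * (1 - percent / 100), 2)

def test_discount_regression():
    # Values captured before any new enhancement or patch was applied.
    assert apply_discount(100.0, 10) == 90.0
    assert apply_discount(19.99, 0) == 19.99
    assert apply_discount(50.0, 100) == 0.0

test_discount_regression()
print("regression suite passed")
```

In practice the IV&V team runs such a suite against every new build, so any change that disturbs previously working behavior is caught before the build moves forward.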

FUNCTIONAL TESTING

Functional testing is a combination of quality assurance and software testing.  Functional testing both “verifies a program by checking it against design document(s) and requirement(s)/specification(s)” (formally the quality assurance process), and by “validating the software against the published user or system requirements” (Software Testing).

NON-FUNCTIONAL TESTING

Non-functional testing is the testing of the non-functional requirements of a software application and comprises a number of smaller, focused tests. Defense organizations typically focus on five non-functional tests: stress, endurance, performance, security, and usability.

  1. Stress testing is thorough testing used to determine the stability of a software product or system; IV&V teams deliberately try to break the system.
  2. Endurance testing runs the software for significant periods of time to discover how it behaves under sustained use.
  3. Performance testing determines how the software performs in terms of responsiveness and stability under a particular workload. Performance testing occurs throughout the testing cycle.
  4. Security testing determines whether the system protects its data and maintains its functionality as intended.
  5. Usability testing evaluates a software application by testing it on users.
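As a small illustration of the performance category (the operation, run count, and 10 ms budget below are all assumed values), a test can sample per-call latencies under repeated load and compare a high percentile against a service-level target:

```python
# Illustrative performance test: sample latencies and check the 95th
# percentile against an assumed service-level budget.
import time

def operation():
    # Stand-in for the transaction under test.
    sum(range(10_000))

def p95_latency(fn, runs=200):
    """Run fn repeatedly and return the 95th-percentile latency."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    samples.sort()
    return samples[int(0.95 * len(samples))]

BUDGET_S = 0.010  # assumed 10 ms service-level target
print(p95_latency(operation) <= BUDGET_S)
```

Stress and endurance variants follow the same pattern with the load turned up or the duration stretched out; the point is that a non-functional requirement becomes testable once it is stated as a measurable threshold.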

*“System” and “software” are used interchangeably in this article.