
Product Quality

Schedule Variation
Formula: (Actual End Date - Planned End Date) / (Planned Duration) x 100
Description: Number of days of deviation from the committed end date, expressed as a percentage of the planned duration.
Indicators: Lag indicator for schedule slippage, customer satisfaction, and competency gaps in project management; lead indicator for employee dissatisfaction.

Review Effectiveness - Requirements
Formula: # of requirement defects found during requirements review / total # of requirement defects detected
Description: Ratio of the number of defects found during the review of requirements to the total number of requirement defects.
Indicators: Indicates the effectiveness of quality control activities; lag indicator for the effectiveness of verification, customer satisfaction, and competency gaps in the verification team.

Review Effectiveness - Design
Formula: # of design defects found during design review / total # of design defects detected
Description: Ratio of the number of defects found during the review of the design to the total number of design defects.
Indicators: Indicates the effectiveness of quality control activities; lag indicator for the effectiveness of verification, customer satisfaction, and competency gaps in the verification team.

Review Effectiveness - Coding
Formula: # of defects found during code review / total # of coding defects detected
Description: Ratio of the number of defects found during the review of code to the total number of coding defects.
Indicators: Indicates the effectiveness of quality control activities; lag indicator for the effectiveness of verification, customer satisfaction, and competency gaps in the verification team.

Cost of Quality
Formula: effort spent on review, testing, rework, and prevention / total effort spent on the project
Description: Ratio of the effort spent on ensuring quality to the total effort spent on the project.
Indicators: Lag indicator for profitability; lag indicator for competency gaps in the team.
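As a worked illustration of the two formulas directly above, a minimal Python sketch; the dates and effort figures are hypothetical.

    from datetime import date

    # Schedule Variation: slippage relative to planned duration, as a percentage.
    planned_start, planned_end = date(2024, 1, 1), date(2024, 3, 31)
    actual_end = date(2024, 4, 14)
    planned_duration = (planned_end - planned_start).days
    schedule_variation = (actual_end - planned_end).days / planned_duration * 100
    print(f"Schedule Variation: {schedule_variation:.1f}%")  # 15.6%

    # Cost of Quality: share of total project effort spent on quality activities.
    quality_hours = {"review": 120, "testing": 400, "rework": 150, "prevention": 80}
    total_project_hours = 3000
    cost_of_quality = sum(quality_hours.values()) / total_project_hours * 100
    print(f"Cost of Quality: {cost_of_quality:.1f}%")  # 25.0%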
Defect Density
Formula: (# of defects reported and accepted by review and test teams) / (size of software reviewed or tested)
Description: Number of defects reported and accepted per unit size of the software during development.
Linked objectives: Financial - Cost Reduction; Customer - Customer Satisfaction; Learning, Innovation and Growth - Competency Development.

Defects / Hr
Formula: # of defects reported against completed deliverables for the week / planned hours for the completed deliverables for the week
Description: Number of defects reported per planned hour for the deliverable.
Linked objectives: Financial - Cost Reduction; Customer - Customer Satisfaction; Learning, Innovation and Growth - Competency Development.

Post Release Defect Density
Formula: # of post-release defects / size of the application
Description: Number of defects reported by the customer per unit size delivered.
Indicators: Lag indicator for the quality of service provided and customer satisfaction; can be compared against organization and industry benchmarks.

Post Release Defects / Hr
Formula: # of post-release defects / planned hours
Description: Number of defects reported by the client per hour.
Indicators: Lag indicator for the quality of service provided and customer satisfaction; lead indicator for project slippage and poor profitability.

Defect Detection Effectiveness (DDE)
Formula: (# of defects reported prior to delivery and accepted) / (# of defects reported prior to delivery and accepted + # of defects reported after delivery and accepted) x 100
Description: Percentage of the total number of defects reported for the application that are reported prior to delivery.
Indicators: Indicates the effectiveness of quality control activities; lag indicator for the effectiveness of V&V, customer satisfaction, and competency gaps in the V&V team.

Defect Acceptance Ratio
Formula: (# of defects accepted as valid) / (# of defects reported by the test team)
Description: Percentage of reported defects that are accepted as valid.
Indicators: A low value indicates poor reviews or testing. Linked objective: Financial - Profit Growth.

Resolution Effectiveness
Formula: (# of problem tickets reported as resolved by the support team and closed during the period - # of problem tickets reopened during the period) / (# of problem tickets reported as resolved by the support team and closed during the period) x 100
Description: Percentage of resolved and closed problem tickets that are not reopened during a period.
Indicators: Lag indicator for issues in the problem management process, competency gaps, and customer satisfaction; lead indicator for employee stress levels.

Defect Count / Sprint (or Story Point)
Formula: # of defects reported / total # of story points (or backlog items) completed
Description: Defects reported against the total number of story points or sprint backlog items completed.
Indicators: Lag indicator for the effectiveness of the V&V process; lead indicator for strengthening the verification process in subsequent sprints.
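A minimal Python sketch of Defect Detection Effectiveness and the Defect Acceptance Ratio; the defect counts are hypothetical.

    # DDE: share of all accepted defects that were caught before delivery.
    pre_delivery_accepted = 180
    post_delivery_accepted = 20
    dde = pre_delivery_accepted / (pre_delivery_accepted + post_delivery_accepted) * 100
    print(f"Defect Detection Effectiveness: {dde:.1f}%")  # 90.0%

    # Defect Acceptance Ratio: share of reported defects accepted as valid.
    reported_by_test_team = 220
    accepted_as_valid = 180
    dar = accepted_as_valid / reported_by_test_team * 100
    print(f"Defect Acceptance Ratio: {dar:.1f}%")  # 81.8%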
Process Performance

Estimate Variation
Formula: (Planned Effort - Actual Effort) / (Planned Effort) x 100
Description: Ratio of the difference between planned effort and actual effort to the planned effort, expressed as a percentage.
Indicators: Lag indicator for competency gaps in the project team.

Productivity - Development
Formula: (size of software) / (effort spent in producing the software)
Description: Size of software produced per person-hour of effort.
Indicators: Lag indicator for employee dissatisfaction, customer dissatisfaction, schedule variance, and competency gaps; used to benchmark our capability.

Resource Mix Index
Formula: (1 x # of Junior resources + 2 x # of Mid-level resources + 3 x # of Senior1 resources + 4 x # of Senior2 resources) / total number of resources
Description: Average of the weighted scores of the resources in each category.
Indicators: Lead indicator for quality of service/product, employee satisfaction, and customer satisfaction; lag indicator for competency gaps.

Customer Satisfaction Index
Formula: NPS score from the customer satisfaction survey done by the project team
Description: The rating, on a scale of 0-10, for the question "How likely is it that you would recommend our services to a friend or a colleague?"
Indicators: Lead indicator for the profitability of the project; lag indicator for customer satisfaction; corrective measures can be introduced based on the score.

Requirements Stability Index
Formula: (1 - (# of requirements added, changed, and deleted) / total # of requirements) x 100
Description: Based on the ratio of the total number of requirement changes to the total number of requirements; 100% means no churn.
Indicators: Lag indicator for schedule slippage and budget overrun; unmanaged change requests would lead to customer dissatisfaction; lead indicator for employee dissatisfaction.
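A minimal Python sketch of the Resource Mix Index and Requirements Stability Index; the head counts and requirement counts are hypothetical.

    # Resource Mix Index: weighted average seniority of the team.
    head_count = {"junior": 4, "mid": 3, "senior1": 2, "senior2": 1}
    weight = {"junior": 1, "mid": 2, "senior1": 3, "senior2": 4}
    rmi = sum(weight[lvl] * n for lvl, n in head_count.items()) / sum(head_count.values())
    print(f"Resource Mix Index: {rmi:.2f}")  # 2.00

    # Requirements Stability Index: 100% means no requirement churn.
    added, changed, deleted = 5, 8, 2
    total_requirements = 100
    rsi = (1 - (added + changed + deleted) / total_requirements) * 100
    print(f"Requirements Stability Index: {rsi:.1f}%")  # 85.0%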
Productivity - Test Case Development
Formula: (# of test cases developed) / (effort for test case development)
Description: Number of test cases developed per person-hour of effort.
Indicators: Lead indicator of schedule slippage.

Productivity - Test Execution
Formula: (# of test cases executed) / (effort for test case execution)
Description: Number of test cases executed per person-hour of effort.
Indicators: Lead indicator of schedule slippage, employee satisfaction, and customer satisfaction.

Productivity - Support
Formula: (# of problem tickets reported as resolved by the support team - # of problem tickets reopened) / (# of person-hours of effort)
Description: Number of problem tickets resolved per person-hour of effort.
Indicators: Lead indicator for quality of service/product, employee satisfaction, and customer satisfaction; lag indicator for competency gaps.

Test Coverage
Formula: (# of requirements covered in test execution) / (# of functional requirements in the approved requirements specification) x 100
Description: Percentage of requirements that are covered in test execution.
Indicators: Lead indicator of the quality of the deliverable.

Test Design FV Coverage
Formula: # of functional variations (FVs) covered in test design / # of FVs in all logic models
Description: Percentage of known functional variations that are covered in test design.
Indicators: Lead indicator of the quality of the deliverable and of risk to the profitability of the project.

Test Execution FV Coverage
Formula: # of FVs covered in test execution / # of FVs in all logic models
Description: Percentage of known FVs that are covered in test execution.
Indicators: Lead indicator of the quality of the deliverable and of risk to the profitability of the project.
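A minimal Python sketch of the three coverage ratios above; the requirement and functional-variation counts are hypothetical.

    # Test Coverage: requirements exercised during test execution.
    covered_requirements = 47
    functional_requirements_in_spec = 50
    print(f"Test Coverage: {covered_requirements / functional_requirements_in_spec:.1%}")  # 94.0%

    # FV coverage: the same idea at functional-variation granularity.
    fvs_in_logic_models = 120
    fvs_in_test_design, fvs_in_test_execution = 110, 102
    print(f"Test Design FV Coverage: {fvs_in_test_design / fvs_in_logic_models:.1%}")  # 91.7%
    print(f"Test Execution FV Coverage: {fvs_in_test_execution / fvs_in_logic_models:.1%}")  # 85.0%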
Ambiguity Density
Formula: # of ambiguities reported / # of text pages of requirements reviewed for ambiguities
Description: Number of ambiguities per page of written requirements text.
Indicators: Direct measure of requirement quality. Linked objective: Financial - Profit Growth.

Unresolved Ambiguity Ratio (UAR)
Formula: # of ambiguities unresolved / # of ambiguities raised
Description: Percentage of reported ambiguities that remain unresolved.
Indicators: Lead indicator of the quality of deliverables.

Average Acknowledgement Time
Formula: (sum of acknowledgement times for all problem tickets) / (# of problem tickets received)
Description: Average time taken to acknowledge a problem ticket.
Indicators: Lag indicator of the team's competence and the stability of the application; the expectation is a steady improvement.

Average Resolution Time
Formula: (sum of resolution times for all problem tickets) / (# of problem tickets resolved)
Description: Average time taken to resolve a problem ticket.
Indicators: Lag indicator of the team's competence and the stability of the application; the expectation is a steady improvement. Linked objective: Customer - Customer Satisfaction.

On-Time Index
Formula: (# of problem tickets resolved on time) / (# of problem tickets resolved) x 100
Description: Percentage of problem tickets that are resolved within the maximum allowed resolution time.
Indicators: Lag indicator for issues in the knowledge transfer process, competency gaps, and customer satisfaction.

Backlog Index
Formula: (# of tickets resolved in the period) / (# of tickets open at the beginning of the period + # of tickets received during the period) x 100
Description: Percentage of the open and incoming problem tickets that are resolved during the period.
Indicators: Lead indicator for project profitability and customer satisfaction.

Dependency Index
Formula: (# of problem tickets resolved to date by the support team without assistance from the client) / (total # of problem tickets resolved by the team to date) x 100
Description: Percentage of problem tickets resolved and closed by the project team without seeking assistance from client personnel.
Indicators: Lag indicator of our productivity.

Sprint Velocity
Formula: (total # of story points (or backlog items) completed) / (total # of sprints completed)
Description: Number of story points completed in each sprint.
Indicators: Lead indicator for subsequent sprints, used to introduce measures that improve productivity; demonstrates productivity increases to the customer.

Sprint Burn Down Rate
Formula: sum of backlog effort perceived on any day / sum of backlog effort planned for that day
Description: Ratio of actual (perceived) sprint backlog effort to planned sprint backlog effort.
Indicators: Lead indicator of whether we will meet the scheduled dates.
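A minimal Python sketch of the problem-ticket metrics; the ticket records and period counts are hypothetical, with acknowledgement and resolution times in hours.

    # Each ticket: (ack_hours, resolve_hours, resolved_on_time, resolved_this_period)
    tickets = [
        (0.5, 8.0, True, True),
        (2.0, 30.0, False, True),
        (1.0, 4.5, True, True),
    ]
    open_at_start, received_in_period = 4, 3

    avg_ack_time = sum(t[0] for t in tickets) / len(tickets)
    avg_resolution_time = sum(t[1] for t in tickets) / len(tickets)
    on_time_index = sum(t[2] for t in tickets) / len(tickets) * 100
    backlog_index = sum(t[3] for t in tickets) / (open_at_start + received_in_period) * 100

    print(f"Average Acknowledgement Time: {avg_ack_time:.1f} h")  # 1.2 h
    print(f"Average Resolution Time: {avg_resolution_time:.1f} h")  # 14.2 h
    print(f"On-Time Index: {on_time_index:.0f}%")  # 67%
    print(f"Backlog Index: {backlog_index:.0f}%")  # 43%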
ISO/IEC 9126 quality characteristics and sub-characteristics:

Functionality: suitability, accuracy, interoperability, security
Reliability: maturity, fault tolerance, recoverability
Usability: understandability, learnability, operability, attractiveness
Efficiency: time behaviour, resource utilisation
Maintainability: analysability, changeability, stability, testability
Portability: adaptability, installability, co-existence, replaceability

Code-level attributes related to the maintainability sub-characteristics:

Analysability: algorithm complexity, self-descriptiveness, modularity, structuredness, consistency
Changeability: structuredness, modularity, packaging

Source-code properties and the measures used for each:

Volume: LOC; man-months based on FP
Complexity per unit: cyclomatic complexity; fan-in, fan-out, coupling, and stability measures
Duplication: duplicated blocks of more than 6 lines
Unit size: lines of code per unit
Unit testing: unit test coverage
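As an illustration of the duplication measure, a minimal Python sketch that reports blocks of 7 identical consecutive lines (i.e., "more than 6 lines") shared between two files; the windowing approach is an assumption, not a prescribed tool.

    import sys

    def duplicated_blocks(lines_a, lines_b, window=7):
        """Yield (i, j) where `window` consecutive lines of A also appear in B."""
        seen = {}
        for i in range(len(lines_a) - window + 1):
            seen.setdefault(tuple(lines_a[i:i + window]), []).append(i)
        for j in range(len(lines_b) - window + 1):
            for i in seen.get(tuple(lines_b[j:j + window]), []):
                yield i, j

    if __name__ == "__main__":
        a = open(sys.argv[1]).read().splitlines()
        b = open(sys.argv[2]).read().splitlines()
        for i, j in duplicated_blocks(a, b):
            print(f"A lines {i + 1}-{i + 7} duplicate B lines {j + 1}-{j + 7}")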
ISO/IEC 25010 product quality characteristics and sub-characteristics:

Functional suitability: functional completeness, functional correctness, functional appropriateness
Reliability: maturity, availability, fault tolerance, recoverability
Performance efficiency: time behaviour, resource utilisation, capacity
Usability: appropriateness recognisability, learnability, operability, user error protection, user interface aesthetics, accessibility
Maintainability: modularity, reusability, analysability, modifiability, testability
Security: confidentiality, integrity, non-repudiation, accountability, authenticity
Compatibility: co-existence, interoperability
Portability: adaptability, installability, replaceability
