<--- Score
55. Is data collected and displayed to better understand customers’ critical needs and requirements?
<--- Score
56. How do you manage unclear Open source appropriate technology requirements?
<--- Score
57. What was the context?
<--- Score
58. What critical content must be communicated – who, what, when, where, and how?
<--- Score
59. Are the Open source appropriate technology requirements complete?
<--- Score
60. Has an Open source appropriate technology requirement not been met?
<--- Score
61. What is the scope of the Open source appropriate technology effort?
<--- Score
62. What is the scope of Open source appropriate technology?
<--- Score
63. When are meeting minutes sent out? Who is on the distribution list?
<--- Score
64. How can the value of Open source appropriate technology be defined?
<--- Score
65. Is there a critical path to deliver Open source appropriate technology results?
<--- Score
66. Have the customer(s) been identified?
<--- Score
67. Do the problem and goal statements meet the SMART criteria (specific, measurable, attainable, relevant, and time-bound)?
<--- Score
68. What are the Open source appropriate technology use cases?
<--- Score
69. How do you gather Open source appropriate technology requirements?
<--- Score
70. Has a project plan, Gantt chart, or similar been developed/completed?
<--- Score
71. Has a team charter been developed and communicated?
<--- Score
72. Is there any additional definition of success for Open source appropriate technology?
<--- Score
73. What Open source appropriate technology services do you require?
<--- Score
74. What knowledge or experience is required?
<--- Score
75. Are there different segments of customers?
<--- Score
76. How do you gather requirements?
<--- Score
77. How was the ‘as is’ process map developed, reviewed, verified, and validated?
<--- Score
78. Have all of the relationships been defined properly?
<--- Score
79. What are the dynamics of the communication plan?
<--- Score
80. What are the rough order-of-magnitude estimates of the cost savings/opportunities that Open source appropriate technology brings?
<--- Score
81. Is there regularly 100% attendance at the team meetings? If not, have appointed substitutes attended to preserve cross-functionality and full representation?
<--- Score
82. What are the roles and responsibilities of each team member and the team’s leadership? Where are they documented?
<--- Score
83. What is in scope?
<--- Score
84. What is the context?
<--- Score
85. Has your scope been defined?
<--- Score
86. How does the Open source appropriate technology manager guard against scope creep?
<--- Score
87. Is the team equipped with available and reliable resources?
<--- Score
88. What are the boundaries of the scope? What is in bounds and what is not? What is the start point? What is the stop point?
<--- Score
89. Is Open source appropriate technology required?
<--- Score
90. What would be the goal or target for an Open source appropriate technology improvement team?
<--- Score
91. What information do you gather?
<--- Score
92. What sort of initial information should you gather?
<--- Score
93. Are resources adequate for the scope?
<--- Score
94. Does the team have regular meetings?
<--- Score
95. How do you manage changes in Open source appropriate technology requirements?
<--- Score
96. Is the Open source appropriate technology scope manageable?
<--- Score
97. What are the Open source appropriate technology tasks and definitions?
<--- Score
98. What is out of scope?
<--- Score
99. How will the Open source appropriate technology team and the group measure the complete success of Open source appropriate technology?
<--- Score
100. What specifically is the problem? Where does it occur? When does it occur? What is its extent?
<--- Score
101. How do you catch Open source appropriate technology definition inconsistencies?
<--- Score
102. Are task requirements clearly defined?
<--- Score
103. Who is gathering information?
<--- Score
104. Why are consistent Open source appropriate technology definitions important?
<--- Score
105. How and when will the baselines be defined?
<--- Score
106. Has the improvement team collected the ‘voice of the customer’ (obtained qualitative and quantitative feedback)?
<--- Score
107. What methods were used to solicit customer feedback?
<--- Score
108. Are approval levels defined for contracts and supplements to contracts?
<--- Score
109. Is there a completed SIPOC representation, describing the Suppliers, Inputs, Process, Outputs, and Customers?
<--- Score