<--- Score
60. What happens if Digital customer experience’s scope changes?
<--- Score
61. How do you gather the stories?
<--- Score
62. How often are the team meetings?
<--- Score
63. What Digital customer experience requirements should be gathered?
<--- Score
64. How do you hand over Digital customer experience context?
<--- Score
65. Who defines (or who defined) the rules and roles?
<--- Score
66. What information do you gather?
<--- Score
67. Is the team formed and are team leaders (Coaches and Management Leads) assigned?
<--- Score
68. Is there a completed, verified, and validated high-level ‘as is’ (not ‘should be’ or ‘could be’) stakeholder process map?
<--- Score
69. How do you gather Digital customer experience requirements?
<--- Score
70. Is Digital customer experience currently on schedule according to the plan?
<--- Score
71. How do you manage unclear Digital customer experience requirements?
<--- Score
72. What sources do you use to gather information for a Digital customer experience study?
<--- Score
73. Are audit criteria, scope, frequency and methods defined?
<--- Score
74. What are the requirements for audit information?
<--- Score
75. Is the Digital customer experience scope manageable?
<--- Score
76. When are meeting minutes sent out? Who is on the distribution list?
<--- Score
77. Are there any constraints known that bear on the ability to perform Digital customer experience work? How is the team addressing them?
<--- Score
78. Do you all define Digital customer experience in the same way?
<--- Score
79. What are the boundaries of the scope? What is in bounds and what is not? What is the start point? What is the stop point?
<--- Score
80. Who approved the Digital customer experience scope?
<--- Score
81. How will the Digital customer experience team and the group measure complete success of Digital customer experience?
<--- Score
82. What is the political context, and who holds power?
<--- Score
83. How would you define Digital customer experience leadership?
<--- Score
84. Is the improvement team aware of the different versions of a process: what they think it is vs. what it actually is vs. what it should be vs. what it could be?
<--- Score
85. What scope should be assessed?
<--- Score
86. What gets examined?
<--- Score
87. Will a Digital customer experience production readiness review be required?
<--- Score
88. Does the team have regular meetings?
<--- Score
89. Why are you doing Digital customer experience and what is the scope?
<--- Score
90. Is what you do clearly defined within and to your organization?
<--- Score
91. Is there regularly 100% attendance at the team meetings? If not, have appointed substitutes attended to preserve cross-functionality and full representation?
<--- Score
92. How does the Digital customer experience manager ensure against scope creep?
<--- Score
93. Will team members perform Digital customer experience work when assigned and in a timely fashion?
<--- Score
94. What are the dynamics of the communication plan?
<--- Score
95. Is the Digital customer experience scope complete and appropriately sized?
<--- Score
96. When is the estimated completion date?
<--- Score
97. Who are the Digital customer experience improvement team members, including Management Leads and Coaches?
<--- Score
98. Is the team adequately staffed with the desired cross-functionality? If not, what additional resources are available to the team?
<--- Score
99. Are roles and responsibilities formally defined?
<--- Score
100. Have specific policy objectives been defined?
<--- Score
101. Why are consistent Digital customer experience definitions important?
<--- Score
102. Is there a critical path to deliver Digital customer experience results?
<--- Score
103. How is the team tracking and documenting its work?
<--- Score
104. How would you define the culture at your organization, and how susceptible is it to Digital customer experience changes?
<--- Score
105. How do you think the partners involved in Digital customer experience would have defined success?
<--- Score
106. Are team charters developed?
<--- Score
107. Are all requirements met?
<--- Score
108. How will variation in the actual durations of each activity be dealt with to ensure that the expected Digital customer experience results are met?
<--- Score
109. What is the context?
<--- Score
110. How do you manage changes in Digital customer experience requirements?
<--- Score
111. How and when will the baselines be defined?
<--- Score
112. How did the Digital customer experience manager receive input to the development of a Digital customer experience improvement plan and the estimated completion dates/times of each activity?
<--- Score
113. Are different versions of process