Product and UX Resources
Links in this doc are restricted.
- e-QIP vision and roadmap (2017 slides)
- Investigations process overview (starts on page 3)
- Collected project resources page
- DISA Project Glossary
- List of new feature ideas (slides)
- Initial 2017 mapping of some process journeys, based on user conversations and visits to several Army/DOD processing facilities (pdf)
- Customer Journey Map (pdf) (sketch)
- Customer Communication Opportunities (pdf)
- Interviews with current eQIP agency users, including their themes and pain points (mural)
- UX Research Update (slides)
- Collected Insights from Agency users (mural)
- Collected Insights from Applicant users (mural)
- Accessibility Considerations (doc)
- Timeline sections and restyled form components (docs and videos)
- Updated timeline validation (Round 1 docs and videos) (Round 2 docs and videos)
- Review workflows (docs and videos)
- Content reviewed/rewritten, including rewritten questions that would require a policy change to implement (doc)
- List of questions, their help text, and error text, with highlights and comments on things that seemed inconsistent (spreadsheet)
- Comprehensive form tracking spreadsheet, used to compare content, error messages, and help text across all three major form types (86, 85, 85P) (spreadsheet)
- Templates for sprint review folder
- Example Slides
- Mural
- Participant Spreadsheet (restricted access)
Testing Notes: In the past we've done a mix of usertesting.com and in-person or video usability testing (roughly 5–7 users per sprint). Occasionally this has included people who use screen readers, but we have not done a large push to research that audience. Users and devices have not been part of research segmentation in the past; going forward, we recommend intentionally segmenting by device (PC vs. Mac), assistive technology use, technical competency, contractor vs. federal hire, etc.
User Acceptance Testing (UAT): In addition to usability testing, UAT has been conducted throughout the development process. UAT is less focused on the usability of the system and more on finding bugs or regressions in the software, and it is done less frequently than usability testing. Issues or bugs found during UAT are captured and tracked in an internal NBIS tracking tool. In 2018 and 2019, OPM and the Army did the majority of the testing. In the summer of 2019, more agencies are being onboarded to test as part of the full Investigation Management testing process.
A qualitative feedback form could be attached to the final page in the system, shown after an applicant submits their form, to continue collecting qualitative data.