Automated Assessment Tool Business Requirements
December–January 2024
Cardiff University – Group and Individual Project
Following the creation of the initial Lean Canvas for the automated assessment tool, my team and I progressed to the next phase of the project: developing a comprehensive set of system requirements and a high-level design. This phase was essential in transitioning from conceptual ideas to actionable plans, forming the foundation for a functional tool designed to streamline the assessment process.
The automated assessment tool aimed to support teaching staff in managing summative and formative assessments more efficiently, ultimately enhancing student performance through improved grading and feedback systems. By exploring top-level use case diagrams, detailed user stories, and non-functional requirements, we ensured the system was both user-focused and technically feasible. This phase emphasized the importance of aligning user needs, business goals, and technical constraints to create a scalable and impactful solution.
Top-Level Use Case Diagram
Our work began with creating a Top-Level Use Case Diagram that outlined the system’s main features, focusing on user interactions such as submitting assessments, quickly marking student work, and providing prompt feedback.
Non-Functional Requirements
We also created a set of Non-Functional Requirements that defined critical aspects of the system’s performance, such as security, scalability, usability, and accessibility. These specifications ensured the system would meet the needs of both educators and students while maintaining reliability and ease of use across different devices and platforms.
Security:
The system shall implement secure user authentication for both lecturers and students. Access to sensitive data, such as student records and assessment questions, shall be restricted based on user roles and permissions (sketched after this subsection).
All communication between the client and server shall be encrypted to prevent unauthorised access or data tampering.
The system shall provide audit trails to track changes made by users, including modifications to assessment questions and student records.
Validation: Conduct penetration testing to identify and address potential vulnerabilities in the authentication mechanism and data access controls.
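To make the role-based access requirement concrete, here is a minimal TypeScript sketch of the kind of check it implies, assuming an Express-style server. The routes, role names, and the requireRole helper are illustrative assumptions rather than the project's actual implementation.

    import express, { Request, Response, NextFunction } from "express";

    type Role = "lecturer" | "student";

    // Assumption: authentication middleware has already verified the session
    // and attached the user's identity and role to the request.
    interface AuthedRequest extends Request {
      user?: { id: string; role: Role };
    }

    // Allow the request through only if the authenticated user holds one of
    // the permitted roles; otherwise respond with 403 Forbidden.
    function requireRole(...allowed: Role[]) {
      return (req: AuthedRequest, res: Response, next: NextFunction) => {
        if (!req.user || !allowed.includes(req.user.role)) {
          return res.status(403).json({ error: "Forbidden" });
        }
        next();
      };
    }

    const app = express();
    const ok = (_req: Request, res: Response) => res.sendStatus(204); // placeholder handler

    // Only lecturers may modify assessment questions; both roles may view
    // their own grades (per-record ownership checks would apply inside).
    app.put("/assessments/:id/questions", requireRole("lecturer"), ok);
    app.get("/grades/me", requireRole("lecturer", "student"), ok);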
Scalability:
In the two weeks before an assessment deadline, as many as 90% of the cohort may be using the system; the system should manage and handle this increased load efficiently.
Resource scaling should include, but not be limited to, computational resources, storage, and network bandwidth to meet increased demand.
Validation: Perform load testing simulations with a high volume of concurrent users to verify that the system can handle peak usage around assessment deadlines.
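The kind of load test this validation describes can be approximated with a short script. The TypeScript sketch below (assuming Node 18+ for the global fetch) fires a burst of concurrent submissions at a hypothetical endpoint and reports a latency percentile; a real test would use a dedicated tool with staged ramp-up.

    // Minimal load-test sketch: N concurrent requests, then report p95 latency.
    // A real test would use a dedicated tool such as k6, Locust, or JMeter
    // with ramp-up stages and realistic user think time.
    const TARGET = "https://assessment.example.ac.uk/api/submissions"; // assumed URL
    const CONCURRENT_USERS = 500;

    async function timedRequest(): Promise<number> {
      const start = Date.now();
      await fetch(TARGET, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ answer: "sample" }),
      });
      return Date.now() - start;
    }

    async function run(): Promise<void> {
      const latencies = await Promise.all(
        Array.from({ length: CONCURRENT_USERS }, () => timedRequest())
      );
      latencies.sort((a, b) => a - b);
      const p95 = latencies[Math.floor(latencies.length * 0.95)];
      console.log(`p95 latency with ${CONCURRENT_USERS} concurrent users: ${p95} ms`);
    }

    run();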
Reliability:
The system shall ensure high availability, aiming for at least 98% uptime during critical periods, such as assessment deadlines.
Critical functionalities, such as accessing assessment materials, submitting assignments, and viewing grades, shall be available offline, with seamless synchronisation once connectivity is restored (sketched after this subsection).
Validation: Implement failover and disaster recovery tests to simulate unexpected failures and verify that the system can gracefully recover without data loss or service interruption.
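To show the shape of the offline-synchronisation requirement, here is a browser-side TypeScript sketch that queues submissions while offline and replays them when connectivity returns. The /api/submissions endpoint and storage key are illustrative assumptions, and localStorage is used only for brevity; a production system would more likely use a service worker with IndexedDB.

    const QUEUE_KEY = "pendingSubmissions";

    function queueLocally(payload: object): void {
      const queue: object[] = JSON.parse(localStorage.getItem(QUEUE_KEY) ?? "[]");
      queue.push(payload);
      localStorage.setItem(QUEUE_KEY, JSON.stringify(queue));
    }

    async function submit(payload: object): Promise<void> {
      if (!navigator.onLine) return queueLocally(payload);
      try {
        await fetch("/api/submissions", {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify(payload),
        });
      } catch {
        queueLocally(payload); // request failed mid-flight; keep it for later
      }
    }

    // Replay anything still queued as soon as the browser regains connectivity.
    window.addEventListener("online", async () => {
      const queue: object[] = JSON.parse(localStorage.getItem(QUEUE_KEY) ?? "[]");
      localStorage.removeItem(QUEUE_KEY);
      for (const payload of queue) await submit(payload);
    });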
Usability:
The system should provide clear instructions for creating, modifying, and taking assessments, as well as clear confirmation whenever a final action is taken, e.g. submitting an assessment.
Common actions, such as creating assessments, adding questions and accessing results, shall be easily accessible and prominently displayed.
Validation: Conduct usability testing sessions with representative users to gather feedback on the intuitiveness and clarity of the interface.
Portability:
The system shall be compatible with major web browsers (e.g., Chrome, Firefox, Safari) to ensure access from various devices and platforms.
The system should be compatible with major operating systems (e.g. Windows, macOS, and Linux), allowing access from both University-issued laptops and personal machines.
It should be responsive, adapting to different screen sizes and resolutions so that students without laptops can still access the system (see the sketch after this subsection).
Validation: Test the system across different web browsers and operating systems to ensure compatibility. Use responsive design testing tools to verify that the interface adapts seamlessly to various screen sizes and resolutions.
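As a small illustration of the responsiveness requirement, the TypeScript sketch below toggles a layout class when the viewport crosses a breakpoint. In practice most responsive behaviour would live in CSS media queries; the 600px breakpoint and class name here are assumptions.

    const narrowViewport = window.matchMedia("(max-width: 600px)");

    // Apply a compact layout whenever the viewport is narrower than the breakpoint.
    function applyLayout(query: MediaQueryList | MediaQueryListEvent): void {
      document.body.classList.toggle("compact-layout", query.matches);
    }

    applyLayout(narrowViewport);                            // set the initial layout
    narrowViewport.addEventListener("change", applyLayout); // react to resizes and rotations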
Compatibility:
The system shall be compatible with Learning Management Systems (LMS) such as Learning Central.
Validation: Confirm interoperability with Learning Central by conducting integration tests. Ensure that data exchange and functionality remain intact.
Extensibility:
The system architecture should be designed to easily accommodate future extensions, as identified through client sessions or evolving requirements.
The system shall be designed to allow for the seamless integration of a plagiarism checker module.
The system shall include configurable settings to enable or disable the plagiarism checker functionality (illustrated in the sketch below).
Validation: Implement a proof-of-concept integration with a plagiarism checker module to demonstrate seamless integration capabilities.
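One way such a proof of concept could be shaped is sketched below in TypeScript: the checker sits behind an interface and is gated by a configuration flag, so the module can be absent, disabled, or swapped. All names are illustrative assumptions, not the project's actual design.

    interface PlagiarismChecker {
      check(submissionText: string): Promise<{ similarityScore: number }>;
    }

    interface SystemConfig {
      plagiarismCheckerEnabled: boolean;
    }

    class SubmissionPipeline {
      constructor(
        private config: SystemConfig,
        private checker?: PlagiarismChecker // optional: the module may not be deployed
      ) {}

      async process(text: string): Promise<void> {
        // ...store the submission, run marking, etc...
        if (this.config.plagiarismCheckerEnabled && this.checker) {
          const { similarityScore } = await this.checker.check(text);
          console.log(`Similarity: ${(similarityScore * 100).toFixed(1)}%`);
        }
      }
    }

    // A concrete checker is wired in only where the module exists, e.g.:
    // new SubmissionPipeline({ plagiarismCheckerEnabled: true }, someChecker);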
Accessibility:
Users shall have the ability to adjust colours, contrast and font sizes to enhance readability and usability.
The system shall allow for the incorporation of university-approved assistive technologies, such as screen readers and speech recognition software, to ensure accessibility for users with disabilities.
Keyboard navigation shall be fully supported, allowing users to navigate through the system and interact with all features (a sketch follows this subsection).
Validation: Conduct accessibility audits with assistive technologies to ensure compliance with accessibility standards. Gather feedback from users with diverse accessibility needs to assess the effectiveness of support.
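As an example of what full keyboard support involves, here is a TypeScript sketch of a focus trap for a custom dialog, one of the trickier keyboard-navigation cases: Escape closes the dialog, and Tab/Shift+Tab wrap so focus never escapes it. The selector list is a common heuristic and the whole sketch is an assumption, not part of our actual build.

    function trapFocus(dialog: HTMLElement): void {
      const focusable = dialog.querySelectorAll<HTMLElement>(
        'button, [href], input, select, textarea, [tabindex]:not([tabindex="-1"])'
      );
      if (focusable.length === 0) return;
      const first = focusable[0];
      const last = focusable[focusable.length - 1];

      dialog.addEventListener("keydown", (e: KeyboardEvent) => {
        if (e.key === "Escape") dialog.dispatchEvent(new Event("close"));
        if (e.key !== "Tab") return;
        if (e.shiftKey && document.activeElement === first) {
          e.preventDefault(); // wrap backwards from the first element
          last.focus();
        } else if (!e.shiftKey && document.activeElement === last) {
          e.preventDefault(); // wrap forwards from the last element
          first.focus();
        }
      });
    }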
User Story
Each team member then developed detailed User Stories with Acceptance Criteria for specific functionalities, ensuring that every feature was both user-focused and measurable. My user story was for the teaching team and focused on the ‘set and upload assessments and rubric’ use case. I also created acceptance criteria against which a future prototype could be assessed.
“As a member of the teaching team, I want to efficiently set and upload assessments along with corresponding rubrics, so that I can streamline the assessment process and ensure clarity and consistency in grading.”
Acceptance Criteria
Accessing the Assessment Creation Interface:
Upon successful login, the teaching team should be able to easily access the assessment creation interface.
Creating Assessments:
Once in the assessment creation interface, the teaching team must be able to create a new assessment and input all necessary details, including title, description, due date, and instructions.
The teaching team should have the capability to upload files associated with the assessment, including documents, images, or multimedia content.
The system must allow the teaching team to define the assessment criteria and grading rubric according to the requirements of the task.
The teaching team should have the flexibility to customise the rubric by adding, editing, or removing criteria, descriptors, and weightings as needed (a possible data shape is sketched after this list).
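To make the rubric structure concrete, the TypeScript sketch below shows one plausible data shape together with a validation rule for weightings. The field names and the sum-to-100% rule are my own assumptions for illustration.

    interface RubricCriterion {
      name: string;       // e.g. "Code quality"
      descriptor: string; // what each level of performance looks like
      weight: number;     // percentage share of the total grade
    }

    interface Rubric {
      assessmentId: string;
      criteria: RubricCriterion[];
    }

    // One sensible check when criteria are added, edited, or removed:
    // the weightings should still sum to 100%.
    function weightsAreValid(rubric: Rubric): boolean {
      const total = rubric.criteria.reduce((sum, c) => sum + c.weight, 0);
      return Math.abs(total - 100) < 1e-9;
    }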
Planning Assessments:
The system must offer options for saving assessments as drafts or publishing them for student access, with clear labelling for ease of use.
The system should allow the teaching team to create assessments ahead of time and set a specific publishing date on which the assessment will be published automatically (see the sketch below).
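A scheduled-publishing feature like this is often implemented as a periodic job. The TypeScript sketch below shows that pattern under assumed field and function names; it is illustrative rather than the prototype's actual code.

    interface Assessment {
      id: string;
      status: "draft" | "published";
      publishAt?: Date; // unset for assessments published manually
    }

    function notifyStudents(assessmentId: string): void {
      console.log(`Notify cohort: assessment ${assessmentId} is now live`);
    }

    // Promote any drafts whose publish date has passed, notifying students.
    function publishDueAssessments(assessments: Assessment[], now = new Date()): void {
      for (const a of assessments) {
        if (a.status === "draft" && a.publishAt && a.publishAt <= now) {
          a.status = "published";
          notifyStudents(a.id); // would also deliver any attached announcements
        }
      }
    }

    // Run once a minute via setInterval or a cron-style scheduler, e.g.:
    // setInterval(() => publishDueAssessments(loadAllAssessments()), 60_000);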
Uploading Assessments:
The system must provide a preview feature to allow the teaching team to review how the assessment and rubric will appear to students.
The system must give the teaching team the option to add announcements or messages that will be sent to students automatically once the assessment is published.
Upon saving or publishing, the teaching team should receive a confirmation message indicating the successful creation/upload of the assessment and rubric.
When an assessment is published, students should receive a notification along with any announcements or messages from the teaching team.
Editing Assessments:
The teaching team should be able to search for specific assessments and rubrics using keywords or filters for quick access.
The system should enable the teaching team to review and edit assessments and rubrics before finalising them, including modifying assessment details, rubric criteria, and any uploaded files.
When a published assessment is edited, students who have already accessed or started the assessment should receive a notification regarding the changes made, ensuring transparency and clarity.
Security and Authorisation:
Uploaded assessments and rubrics must be securely stored and easily retrievable for future reference or modification, ensuring the integrity and confidentiality of data.
The system should enforce role-based permissions to ensure that only authorised teaching team members have the ability to create, upload, and modify assessments and rubrics.
Key Takeaways
This project was instrumental in helping me develop a structured approach to gathering and defining requirements, a critical step in any successful UX process. Here are my key takeaways:
Understanding User Needs: The process of breaking down user interactions into detailed use cases and user stories taught me how to consider the user's journey and needs thoroughly. This ensures that the design phase is grounded in features that provide real value to users.
Clarity in Requirements: Developing clear, specific requirements—both functional and non-functional—helped me understand the importance of precision in defining what a product needs to achieve. This clarity reduces ambiguity during the design phase, allowing for more focused, user-centered solutions.
Importance of Usability and Accessibility: Defining usability and accessibility requirements reinforced the idea that a successful UX design must be inclusive and intuitive for a wide range of users. This focus has influenced how I approach designing interfaces that are easy to navigate, regardless of user ability or device.
This project strengthened my ability to translate business and user needs into actionable design requirements, a skill I now apply to every UX project I undertake.
Next Steps
The next step in this project was the prototyping phase, where we transitioned from planning and requirements gathering to the implementation of a functional prototype for the automated assessment tool. This phase aimed to bring the concepts and designs from the requirements phase to life through an iterative development process.
During prototyping, our team divided the system into manageable modules, allowing each member to take lead responsibility for a specific feature. Collaboration was key to managing dependencies and ensuring seamless integration across different components. Regular testing and iterative feedback loops helped us refine the system, improving its functionality and alignment with user needs.
This phase not only focused on delivering a working prototype but also emphasized meeting software quality criteria such as usability, reliability, and maintainability. By creating and demonstrating a functional version of the tool, we were able to validate our ideas and identify areas for further development and improvement in future iterations.