Project 2: Enhance Another Team's Minesweeper System
Course: EECS 581 Software Engineering II, Fall 2025
Due Date: 11:59 PM CDT, Friday, October 3, 2025
Overview
Teams will switch to another team's Minesweeper project from Project 1 to learn software maintenance principles. Fork the assigned repository and extend the system on a separate branch (not master). Do not contact the original team for assistance. Use the same language and platform as the inherited project. Refactoring and bug fixing are permitted to improve code structure without altering external behavior.
The assignment table (which team inherits which project) will be posted on Canvas by 11:59 PM CDT, Sunday, September 21, 2025. Original teams must link their repository and documentation to the new team but provide no further support.
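The fork-and-branch workflow above can be sketched as follows. This is a local-only illustration of the required branch discipline; the directory and branch names are placeholders, and you would clone your actual fork instead of initializing an empty repository.

```shell
# Local sketch of the branch discipline (names are illustrative;
# in practice, clone your fork of the inherited repository instead).
mkdir demo-fork && cd demo-fork
git init -q
git config user.email "student@example.edu" && git config user.name "Student"
git commit -q --allow-empty -m "snapshot of inherited Project 1 code"
git checkout -q -b project2       # all Project 2 work lives here, never master
git branch --show-current         # prints: project2
```

After the code freeze, any further work goes on other branches; the `project2` branch is left untouched at its final pre-deadline commit.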
Requirements
Project 1 Functionality
- Ensure all required features from Project 1 (game setup, gameplay, mine flagging, player interface, game conclusion) are fully operational, even if the inherited code was incomplete or buggy.
Artificial Intelligence (AI) Solver
AI setup: The AI operates on the same board configuration, with turns alternating between the player and the AI in interactive mode, or with the AI solving the board automatically.
- Easy: The AI uncovers cells at random, avoiding flagged or already uncovered cells.
- Medium: The AI uncovers cells at random until a safe cell (zero adjacent mines) is revealed, then strategically uncovers adjacent cells using the revealed numbers.
- Hard: The AI "cheats" by always uncovering a safe (non-mine) cell, simulating perfect knowledge without ever detonating a mine.
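The three difficulty levels above can be sketched as a single move-selection function. This is a minimal illustration, not a required design: the `Board` class and its method names (`hidden_cells`, `adjacent`, `number`) are hypothetical stand-ins for whatever API the inherited code actually provides, and the language is assumed to be Python.

```python
import random

class Board:
    """Minimal stand-in for the inherited board API (hypothetical names)."""
    def __init__(self, rows, cols, mines):
        self.rows, self.cols = rows, cols
        self.mines = set(mines)      # set of (row, col) mine positions
        self.revealed = set()
        self.flagged = set()

    def hidden_cells(self):
        return [(r, c) for r in range(self.rows) for c in range(self.cols)
                if (r, c) not in self.revealed]

    def adjacent(self, cell):
        r, c = cell
        return [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
                and 0 <= r + dr < self.rows and 0 <= c + dc < self.cols]

    def number(self, cell):
        """Count of mines adjacent to a revealed cell."""
        return sum(n in self.mines for n in self.adjacent(cell))

def choose_move(board, difficulty):
    """Pick the next cell for the AI to uncover at the given difficulty."""
    candidates = [c for c in board.hidden_cells() if c not in board.flagged]
    if difficulty == "easy":
        # Easy: blind random pick among covered, unflagged cells.
        return random.choice(candidates)
    if difficulty == "hard":
        # Hard: "cheat" with perfect knowledge -- never pick a mine.
        return random.choice([c for c in candidates if c not in board.mines])
    # Medium: if a revealed cell shows zero adjacent mines, its covered
    # neighbors are provably safe; otherwise fall back to a random guess.
    for cell in board.revealed:
        if board.number(cell) == 0:
            safe_neighbors = [n for n in board.adjacent(cell) if n in candidates]
            if safe_neighbors:
                return random.choice(safe_neighbors)
    return random.choice(candidates)
```

In practice the medium strategy could go further (e.g., the standard flag-count deduction from revealed numbers); the sketch shows only the zero-cell rule the specification names.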
Custom Addition
- Propose and implement one new feature that enhances the game. Describe the feature using a UML diagram (e.g., class, sequence, or use case diagram) from EECS 348 concepts.
HINT: Suggestions and Ideas
- Timer for game duration with high-score tracking.
- Hint system revealing a safe cell (limited uses).
- Sound effects or animations for uncovering/mines.
- Difficulty levels adjusting mine density or grid size.
- Multiplayer mode for alternating turns.
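As one illustration of the scope expected, the hint-system suggestion above could be sketched as below. All names here are illustrative, and how the hint integrates with the board is up to your design; this only shows the limited-use, safe-cell behavior described in the hint.

```python
import random

class HintSystem:
    """Sketch of the suggested hint feature: reveal one safe cell,
    capped at a fixed number of uses (names are illustrative)."""
    def __init__(self, mines, max_hints=3):
        self.mines = set(mines)     # known mine positions, e.g. {(r, c), ...}
        self.remaining = max_hints

    def hint(self, hidden_cells):
        """Return a safe covered cell, or None if hints are exhausted
        or no safe cell remains."""
        if self.remaining == 0:
            return None
        safe = [c for c in hidden_cells if c not in self.mines]
        if not safe:
            return None
        self.remaining -= 1
        return random.choice(safe)
```

Whatever feature you choose, the UML diagram should capture this kind of interaction (e.g., a sequence diagram of the player requesting a hint and the board revealing the returned cell).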
Language and Platform
Use the same programming language(s) and platform as the inherited Project 1 team.
Submission Requirements
- Code Freeze: Freeze code on a separate branch of the forked repository by the due date. Compliance is based on the final commit timestamp.
- Post-Freeze Development: Continue work on other branches if needed, but do not update the submitted branch after the due date.
- Demo: Demonstrate the project during the weekly GTA/team meeting using the submitted branch as of the code freeze.
- Artifacts: Store all code and documentation in the forked GitHub repository on the submitted branch.
- Peer Reviews: Submit individual Team Peer Evaluation forms via Canvas by the due date.
Grading Criteria (100 Points)
1. Working Product Demonstration (40 Points)
- Platform: Conducted on a device of your choice during the weekly GTA/team meeting.
- Evaluation:
- Presence of all specified features
- Withstands stress testing (penalties for crashes or memory leaks)
- Intuitive interface, requiring no manual
2. System Documentation (40 Points)
Store in a "documentation" folder in the forked GitHub repository's submitted branch.
- Person-Hours Estimate (10 Points): Detail the methodology used to estimate person-hours.
- Actual Person-Hours (10 Points): Day-by-day accounting of each member's hours (coding, testing, meetings, documentation, excluding EECS 581 lectures). No penalties for estimate deviations; penalties for incomplete or fabricated accounting.
- System Architecture Overview (20 Points): High-level description and diagram of system components, data flow, and key data structures, aiding future extensions.
3. Code Documentation and Comments (20 Points)
- Prologue Comments: Include for each file:
- Function, class, or module name and a brief description.
- Inputs and outputs.
- External sources (e.g., generative AI, StackOverflow) with attribution.
- Author's full name and creation date.
- In-Code Comments:
- Comment major code blocks and/or individual lines to explain functionality.
- Indicate whether code is original, sourced, or combined, using explanations in your own words.
- Ensure clarity for GTA and future teams, supplementing system documentation.
- Source Attribution:
- Clearly identify external code sources and rephrase comments distinctly.
- Failure to attribute sources constitutes academic misconduct (see the course syllabus).
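A file meeting the prologue, in-code comment, and attribution requirements above might begin like this. The example assumes Python and uses placeholder details (author name, date, function); adapt the format to the inherited project's language and comment syntax.

```python
# Prologue: board_utils.py
# Module: board_utils -- adjacency helpers for the Minesweeper grid.
# Inputs: a set of (row, col) mine coordinates and a target cell.
# Outputs: the number of mines adjacent to the target cell.
# External sources: none (all code below is original, written by the author).
# Author: Jane Doe (placeholder)    Creation date: 2025-09-22 (placeholder)

def adjacent_mines(mines, cell):
    """Count mines in the eight cells surrounding `cell` (original code)."""
    r, c = cell
    # Check all eight neighbor offsets; out-of-bounds coordinates simply
    # never appear in the mine set, so no explicit bounds check is needed.
    return sum((r + dr, c + dc) in mines
               for dr in (-1, 0, 1) for dc in (-1, 0, 1)
               if (dr, dc) != (0, 0))
```

Code taken or adapted from an external source (generative AI, StackOverflow, etc.) would instead carry a comment naming that source at the point of use, with the explanation rephrased in your own words.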
Mandatory Peer Evaluation (-25 points if not completed)
Each team member must complete a peer evaluation: acting as a manager, divide a $10,000 bonus among your team members and submit the allocation on Canvas.
Project Evaluation Rubric
1. Working Product Demonstration (40 Points)
- Exceeds Expectations (90–100%)
All specified features are present (Project 1 functionality, AI solver, custom addition); system is stable under stress testing; user interface is intuitive without requiring a manual; code is highly modular and extensible.
- Meets Expectations (80–89%)
Most specified features are present (at least two of: Project 1 functionality, AI solver, custom addition); system is mostly stable but may have minor issues under stress testing; user interface is mostly intuitive but may require minimal guidance.
- Unsatisfactory (0–79%)
One or fewer specified features are fully implemented; system crashes or has significant memory leaks; user interface is confusing or requires extensive guidance.
2. Estimate of Person-Hours (10 Points)
- Exceeds Expectations (90–100%)
Detailed methodology for estimating person-hours is complete, clear, and well-justified, enabling easy understanding by the GTA.
- Meets Expectations (80–89%)
Methodology for estimating person-hours is provided but lacks some clarity or detail, making it slightly difficult to understand.
- Unsatisfactory (0–79%)
No estimate provided (0 points); or estimate provided without any methodology or explanation (60%, i.e., 6 of 10 points).
3. Actual Accounting of Person-Hours (10 Points)
- Exceeds Expectations (90–100%)
Complete day-by-day accounting from each team member, detailing hours spent on coding, testing, meetings, and documentation (excluding EECS 581 lectures), with clear and accurate records.
- Meets Expectations (80–89%)
Incomplete day-by-day accounting from team members, or includes non-project time (e.g., EECS 581 lectures), or minor inaccuracies in reporting.
- Unsatisfactory (0–79%)
No accounting provided, or accounting is fabricated or significantly incomplete.
4. System Documentation (20 Points)
- Exceeds Expectations (90–100%)
Comprehensive system architecture overview and documentation in the GitHub repository's submitted branch, including detailed descriptions and diagrams of components, data flow, and key data structures, enabling future teams to easily extend the system.
- Meets Expectations (80–89%)
System documentation is mostly complete but missing minor details or lacks some clarity in describing components, data flow, or data structures, requiring slight effort from future teams.
- Unsatisfactory (0–79%)
System documentation is missing significant details, lacks diagrams, or is insufficient for future teams to understand and extend the system.
5. Code Documentation and Comments (20 Points)
- Exceeds Expectations (90–100%)
Prologue comments in each file include function/class/module name, description, inputs/outputs, external sources with attribution, author's name, and creation date; major code blocks and individual lines are clearly commented to explain functionality, with clear attribution for original, sourced, or combined code.
- Meets Expectations (80–89%)
Prologue comments are present but missing some required elements (e.g., inputs/outputs or attribution); some major code blocks or individual lines lack comments, or attribution is incomplete.
- Unsatisfactory (0–79%)
Prologue comments are missing entirely, or major code blocks and lines have minimal or no comments, or external sources are not attributed, risking academic misconduct.