| Index | Description |
|---|---|
| Project Overview | A detailed summary of this project solution and this repository. |
| Demo | A video and a GIF recorded from our live demo. |
| User Experience & Features | Overview of user roles and capabilities. |
| Technical Details | Key technologies and an explanation of the AWS architecture diagram. |
| Deploying & Running Ace It on AWS | A step-by-step guide on how to deploy this project. |
| User Guide | Overview of the user experience. |
| Change Log | Any changes made post-publish. |
| Credits | Meet the team behind the solution. |
| License | License details. |
Ace It is an AI Study Assistant designed to help university students find and synthesize course content while providing actionable insights to instructors to improve course delivery. Integrated directly with learning management systems (LMS) via LTI, open source, and powered by large language models (LLMs), Ace It can access course information to provide Students with contextual support in areas such as:
- Finding Course Information & Material References
- Providing Learning Recommendations (Tips & Suggested Materials)
- Solution Review & Feedback
- Problem Explanation
This repository contains the complete code for the project, including the frontend, backend, and AWS infrastructure.
Ace It currently supports only Canvas LMS; however, the project can be extended to support additional LTI-based LMS platforms as needed.
Here is a link to our demo video.
Below is a quick demo GIF of our deployed demo version of the solution.
Note that the quality of the GIF is lower than the actual application due to GIF compression.
Users may have one of two roles for each course they are associated with in Canvas: "Student" or "Instructor". Users are not assigned roles for courses in Ace It; the solution instead uses the roles specified in Canvas.
All users are able to:
- Log in using their Canvas authentication credentials
- See all of their courses in Canvas, separated between "Available Courses" (where the user is an Instructor or the Instructor has enabled Student access) and "Unavailable Courses" (where the Instructor has not enabled Student Access)
- Change their preferred language for Ace It (both AI responses and user interface)
With the Student role, users are able to:
- Start new conversation sessions or continue previous ones with the AI Study Assistant
- Ask questions to the AI Study Assistant in their preferred language
- Receive responses from the AI Study Assistant in their preferred language with references to Canvas
With the Instructor role, users are able to:
- View Ace It usage analytics for the course:
  - Number of questions asked
  - Number of student sessions
  - Number of students using Ace It
  - Most common questions (determined using AI)
  - Most referenced materials
- Configure the AI Study Assistant for the course, specifically:
  - Enable Student access to the course (no access by default)
  - Select which course content to include from Canvas (e.g. Announcements, Syllabus, Files, etc.)
  - Select which types of questions should be supported by the AI Study Assistant
  - Configure custom response tone / style
- Test the AI Study Assistant for the course, sending messages like a Student would to evaluate the experience
The solution also features:
- Easily configurable theming and branding.
- Serverless AWS cloud deployment.
- Model agnostic LLM support.
- Canvas LMS LTI-based integration.
Ace It leverages the following key technologies:
- Frontend: React.js (Vite)
- Backend: Python 3
- Infrastructure: AWS CDK
Built with a Serverless architecture, Ace It is designed to be highly scalable and performant. The following diagram depicts the high-level AWS architecture:
- The user communicates with the web application hosted on AWS Cloudfront.
- The web application redirects to Canvas for LTI authentication; an authorization token is returned when access is granted.
- The frontend app communicates with Amazon API Gateway for backend interactions.
- A Lambda function is periodically triggered to retrieve course documents from the Canvas API.
- Documents are retrieved from the Canvas API using the token passed from Amazon API Gateway.
- The retrieved documents are stored in Amazon S3, which initiates a data ingestion workflow.
- A Lambda function, integrated with LangChain, extracts text and metadata (size, date uploaded) from the stored documents in S3.
- The extracted data is embedded using Amazon Bedrock, specifically leveraging the Amazon Titan Text Embeddings v2 model to generate vector embeddings.
- These vector embeddings are stored in a PostgreSQL database. If a user has a preferred language, content is translated using Amazon Translate.
- Course management and assistant access can be configured by sending an API request that invokes a Lambda function. The course configuration settings are restricted to Instructors of that course.
- This Lambda function interacts with the Amazon RDS database.
- A Lambda function generates an LLM response when a student chats with the assistant and sends a query.
- Conversations and chat data are stored in Amazon RDS PostgreSQL.
- The assistant employs a Retrieval-Augmented Generation (RAG) architecture, combining the user's query with relevant course-specific data to generate a response from the LLM (a minimal sketch of this retrieval step follows the list).
- CloudFront fetches the frontend files from the S3 bucket and serves cached frontend content globally.
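To make the embedding and retrieval steps above concrete, here is a minimal sketch (not the project's actual code) of generating an embedding with the Amazon Titan Text Embeddings v2 model on Bedrock and querying a pgvector-enabled PostgreSQL database. The `course_chunks` table and its columns are hypothetical names used only for illustration.

```python
import json

import boto3

bedrock = boto3.client("bedrock-runtime")

def embed_text(text: str) -> list[float]:
    """Generate a vector embedding with Amazon Titan Text Embeddings v2."""
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v2:0",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(response["body"].read())["embedding"]

def retrieve_context(conn, course_id: str, question: str, k: int = 5) -> list[str]:
    """Return the k course chunks most similar to the question.

    conn is an open psycopg2 connection to the RDS PostgreSQL instance;
    the course_chunks table/columns are hypothetical.
    """
    query_vector = "[" + ",".join(str(x) for x in embed_text(question)) + "]"
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT content
            FROM course_chunks                   -- hypothetical table of embedded course material
            WHERE course_id = %s
            ORDER BY embedding <=> %s::vector    -- pgvector cosine distance
            LIMIT %s
            """,
            (course_id, query_vector, k),
        )
        return [row[0] for row in cur.fetchall()]
```

The retrieved chunks are then combined with the student's question in the prompt sent to the LLM, which is the RAG pattern described above.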
The repository is organized into the following main directories:
- `.vscode/`: Contains Visual Studio Code configuration files.
- `AceIT-ECE-Capstone/`: Root directory for the project.
  - `.github/`: GitHub configuration files and workflows.
    - `workflows/`: Contains GitHub Actions workflows for CI/CD.
  - `docs/`: Documentation files.
  - `frontend/`: Frontend codebase.
    - `public/`: Public assets for the frontend.
    - `src/`: Source code for the frontend.
      - `assets/`: Static assets like images, fonts, etc.
      - `components/`: Reusable React components.
      - `hooks/`: Custom React hooks.
      - `pages/`: Page components for different routes.
      - `services/`: API service calls.
      - `styles/`: CSS and styling files.
      - `utils/`: Utility functions and helpers.
  - `lambda/`: AWS Lambda functions.
  - `layers/`: AWS Lambda layers.
  - `priv_ace_it_ece_capstone_main/`: Private directory for main project files.
  - `tests/`: Unit and integration tests for the project.
  - `app.py`: Entry point for the backend application.
  - `cdk.context.json`: AWS CDK context configuration.
  - `cdk.json`: AWS CDK project configuration.
  - `README.md`: Project documentation.
  - `requirements-dev.txt`: Development dependencies for the backend.
  - `requirements.txt`: Production dependencies for the backend.
  - `source.bat`: Batch script for setting up the environment.
To deploy Ace It, you will first need the following prerequisites:
- A Canvas LMS instance where you have the Admin role
- If you need to first set up a Canvas instance, please refer to the Canvas documentation here
- An AWS account with appropriate permissions for deployment
Deploying Ace It on your AWS account requires minor configuration in several areas across infrastructure, Canvas LMS, backend, and frontend.
To integrate this project with Canvas LMS, follow these steps:
- It is recommended to create a Canvas Admin Account for AceIt integration use only.
- Log in to your Canvas LMS instance as an Admin.
- Navigate through Admin > Site Admin > Settings > Admins
- Click + Account Admins
- Fill in your account info.
- Confirm the account details and click Continue > OK.
An access token is required to retrieve course content from Canvas.
- Log in to your AceIt Canvas Admin Account
- Navigate to Account > Settings.
- Scroll down to the Approved Integrations section.
- Click + New Access Token.
- Enter a purpose (e.g., "AceIt Integration") and click Generate Token.
- Copy the generated token and store it securely (this token will not be shown again).
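For reference, this admin token is used as a bearer token against the Canvas REST API when the backend's periodic sync retrieves course content. The following is only an illustrative sketch; listing a course's files is one example of the documented Canvas API, and the URL and token values are placeholders:

```python
import requests

CANVAS_BASE_URL = "https://canvas.example.edu"   # placeholder: your Canvas instance base URL
ADMIN_ACCESS_TOKEN = "<token generated above>"   # keep this in AWS Secrets Manager, not in code

def list_course_files(course_id: int) -> list[dict]:
    """List a course's files via the Canvas REST API using the admin access token."""
    response = requests.get(
        f"{CANVAS_BASE_URL}/api/v1/courses/{course_id}/files",
        headers={"Authorization": f"Bearer {ADMIN_ACCESS_TOKEN}"},
        params={"per_page": 100},  # Canvas paginates results
        timeout=30,
    )
    response.raise_for_status()
    return response.json()
```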
To allow authentication with the Canvas account, create a Developer Key.
- Navigate to Admin > Developer Keys.
- Click + Developer Key > API Key.
- Fill in the required details:
- Key Name: (e.g., "AceIt Key")
- Redirect URIs: The CloudFront Distribution domain name set up in AWS Configuration steps.
- Scopes: Select appropriate API permissions for accessing course data. (Leave scope enforcement disabled to allow access to all endpoints.)
- Click Save Key and toggle it ON.
- Copy the Client ID and Client Secret.
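The Client ID and Client Secret are used by the deployed backend in Canvas's standard OAuth2 flow (the authentication redirect described in the architecture section). As a rough sketch of what that exchange looks like against Canvas's documented `/login/oauth2/token` endpoint, with all values below as placeholders:

```python
import requests

CANVAS_BASE_URL = "https://canvas.example.edu"      # placeholder: your Canvas instance base URL
CLIENT_ID = "<Client ID from the Developer Key>"
CLIENT_SECRET = "<Client Secret from the Developer Key>"
REDIRECT_URI = "https://dxxxxxxxxxxxxxx.cloudfront.net"  # your CloudFront distribution domain

def exchange_code_for_token(code: str) -> dict:
    """Exchange an OAuth2 authorization code for a Canvas access token."""
    response = requests.post(
        f"{CANVAS_BASE_URL}/login/oauth2/token",
        data={
            "grant_type": "authorization_code",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "redirect_uri": REDIRECT_URI,
            "code": code,
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()  # contains access_token, refresh_token, user, etc.
```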
To deploy Ace It on AWS, you'll need an AWS account with the appropriate permissions. This section covers two deployment methods:
- Method 1: Using GitHub Actions for CI/CD deployment.
- Method 2: Manual deployment without using GitHub Actions.
- AWS Account: Ensure you have an AWS account with administrative access.
- Amazon Bedrock: request access to the following models (in AWS Console > Amazon Bedrock):
  - Llama 3.3
  - Amazon Titan Text Embeddings v2
- Clone this GitHub repository.
- Create an IAM role that GitHub Actions can assume to deploy resources on AWS.
1.1 Go to AWS Console > IAM > Create role; make sure you are in the desired region.
1.2 Under Select trusted entity, choose Web identity > Add Identity provider.
1.3 Select OpenID Connect, set the Provider URL to `token.actions.githubusercontent.com`, and click Add Provider.
1.4 Add the Identity Provider you just created and set the Audience to `sts.amazonaws.com`.
1.5 Fill in the GitHub organization.
1.6 Add permissions. For simplicity, we attach `AdministratorAccess` to allow GitHub Actions full access to AWS services.
1.7 Give this role a name and create it.
1.8 Go to the GitHub repository > Settings > Secrets and variables > Actions.
1.9 Create secrets for:
- AWS_ROLE_ARN: The ARN of the IAM role created.
- AWS_REGION: Your preferred AWS region (e.g., us-west-2).
- AWS_ACCOUNT_ID: Your AWS account ID.
1.10 Open the GitHub workflow at `.github/workflows/deploy.yml` and set `role-to-assume` and `aws-region` to `${{ secrets.AWS_ROLE_ARN }}` and `${{ secrets.AWS_REGION }}` respectively (i.e., comment out lines 57-58, uncomment lines 55-56, and delete lines 25-28).
- Create an S3 bucket to host the frontend files.
2.1 Go to AWS Console > S3 > Create bucket > General Purpose, make sure you are in the desired region.
2.2 Give the bucket a name. This bucket serves the frontend, so uncheck `Block all public access`.
2.3 Add the following bucket policy (replace `myfrontendbucket3` with your own bucket name):
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::myfrontendbucket3/*"
        }
    ]
}
```
- Set Up CloudFront
3.1 Go to AWS Console > CloudFront > Create distribution, make sure you are in the desired region.
3.2 Create Distribution:
- Origin Domain Name: Select the S3 bucket created in step 2.
- Viewer Protocol Policy: Redirect HTTP to HTTPS
- Allowed HTTP Methods: GET, HEAD, OPTIONS, PUT, POST, PATCH, DELETE
- Cache Policy: Managed-CachingOptimized
- Origin Request Policy: CORS-S3Origin
- Response headers policy name: Managed-CORS-With-Preflight
3.3 After creating the distribution, copy the Distribution domain name. It will be used for Access-Control-Allow-Origin.
- Modify `.github/workflows/deploy.yml`:
  - Set `role-to-assume` to `${{ secrets.AWS_ROLE_ARN }}` and `aws-region` to `${{ secrets.AWS_REGION }}`, if you have not already done so.
- Follow the instructions in this AWS documentation to install the AWS CLI and configure local AWS credentials.
- Create an S3 bucket to host the frontend files.
2.1 Go to AWS Console > S3 > Create bucket > General Purpose, make sure you are in the desired region.
2.2 Give the bucket a name. This bucket serves the frontend, so uncheck `Block all public access`.
2.3 Upload all the files in frontend to the S3 bucket created.
2.4 Add the following bucket policy (replace `myfrontendbucket3` with your own bucket name):
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::myfrontendbucket3/*"
        }
    ]
}
```
- Set up CloudFront by following the steps in Method I, step 3.
- Due to security considerations, it is best practice to manually create the Canvas secrets.
1.1 Go to AWS Console > Secrets Manager > Store a new secret.
1.2 Select secret type = Other type of secret.
1.3 Under Secret value - Key/value, enter:

| Secret key | Secret value |
|---|---|
| apiKeyId | The API Developer Key ID displayed in Canvas |
| apiKey | The API Developer Key obtained from Canvas |
| baseURL | The base URL of the Canvas server |
| redirectURI | The CloudFront Distribution domain name set up in the AWS Configuration steps |
| adminAccessToken | The Canvas Admin Access Token |
- Give this secret a name. If the name is anything other than "CanvasSecrets", also change the secret name in get_canvas_secret.py by replacing `secret_name` with the name of your secret (e.g. "CanvasSecrets").
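For reference, reading this secret from AWS Secrets Manager with boto3 typically looks like the sketch below; the function mirrors the role of get_canvas_secret.py, but the project's actual helper may be structured differently.

```python
import json

import boto3

def get_canvas_secret(secret_name: str = "CanvasSecrets") -> dict:
    """Fetch the Canvas integration secret from AWS Secrets Manager.

    Illustrative only; the repository's real get_canvas_secret.py may differ.
    """
    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=secret_name)
    # Keys as configured above: apiKeyId, apiKey, baseURL, redirectURI, adminAccessToken
    return json.loads(response["SecretString"])
```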
- Once the necessary secrets have been configured in AWS Secrets Manager, you can deploy the backend code. There are two methods for deployment:
- Method I. Using GitHub Actions [Recommended]
  3.1.1 Create a new GitHub repository (if not already done): go to GitHub → New Repository → set the repository name (e.g., AceIT-ECE-Capstone).
  3.1.2 Commit and push your cloned repository to GitHub.
  3.1.3 Ensure that the GitHub Secrets are properly set up as described in the AWS Configuration step.
  3.1.4 Trigger the GitHub Actions deployment: push any change to main to trigger the deployment workflow.
  3.1.5 If successful, the backend Lambda functions and infrastructure should be deployed. You can check the deployed stack URL in GitHub Repository > Actions > select the latest deployment workflow.
  3.1.6 To prefix the resources that GitHub Actions deploys, modify line 59 of the workflow file `deploy.yml` to be `npx cdk deploy --require-approval never --context env_prefix=prefixlikedev`.
- Method II. Manual Deployment Without GitHub Actions
  3.2.1 If you prefer not to use GitHub Actions, follow these steps to deploy the backend manually using AWS CDK.
  3.2.2 Install the AWS CDK and the Python dependencies by running `npm install -g aws-cdk`, `pip install -r requirements.txt`, and `pip install -r requirements-dev.txt` in a terminal, making sure the working directory is the repository root (i.e., ACEIT-ECE-CAPSTONE).
  3.2.3 Make sure you have completed the steps in AWS Configuration Method II to set up your AWS credentials.
  3.2.4 In the terminal, run `cdk bootstrap`, then `cdk deploy --require-approval never`. If you want to prefix resources for development and production, run `cdk deploy --require-approval never --context env_prefix=prefixlikedev` instead.
  3.2.5 If successful, a deployed stack URL will be displayed in your terminal. These are your backend function URLs.
- After deployment is successful, record the stack URL (e.g. https://something.execute-api.us-west-2.amazonaws.com/prod/). This can be obtained from the `Deploy Stack` step if deploying with GitHub Actions, or from the output of step 3.2.5 for the manual method. Replace the value of `VITE_REACT_APP_API_URL` in frontend/.env with your actual stack URL. Also set `VITE_REACT_APP_HOST_URI` to your CloudFront distribution domain name and `VITE_REACT_APP_CANVAS_URL` to your Canvas URL. Then modify the `Access-Control-Allow-Origin` value in `lambda/utils/construct_response.py` to be your CloudFront distribution domain name (see the sketch below). Save and deploy again. Now, if you visit the Distribution domain name from CloudFront, you can visit Ace It!
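The exact contents of `lambda/utils/construct_response.py` may differ, but a Lambda proxy response helper of this kind generally resembles the sketch below; the point is that the `Access-Control-Allow-Origin` header value is the one you need to change to your CloudFront distribution domain.

```python
import json

# Placeholder: replace with your CloudFront distribution domain name.
ALLOWED_ORIGIN = "https://dxxxxxxxxxxxxxx.cloudfront.net"

def construct_response(status_code: int, body: dict) -> dict:
    """Build an API Gateway (Lambda proxy) response with CORS headers.

    Illustrative sketch only; the repository's actual helper may differ.
    """
    return {
        "statusCode": status_code,
        "headers": {
            "Content-Type": "application/json",
            "Access-Control-Allow-Origin": ALLOWED_ORIGIN,
            "Access-Control-Allow-Headers": "Content-Type,Authorization",
        },
        "body": json.dumps(body),
    }
```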
- Important:
  Please note that due to AWS resource naming constraints, the prefix should contain only lowercase letters. The first deployment will take roughly 25 minutes, so please be patient. If your Canvas server has a TLS certificate, replace all occurrences of `verify=False` with `verify=True` in this repository to enable secure communication.
There are two key types of frontend configuration:
- Configure theme and branding in `frontend/theme.ts`
- Configure API destinations in `frontend/.env`
Further details on what to configure in each of these files can be found within the files themselves.
To learn how to use the application, see the User Guide.
N/A
This application was architected and developed by SF-59: Zane Frantzen, Christine Jiang, Tony Li, Catherine Zhao, with guidance from our Capstone supervising professor Sidney Fels and supervising TA Hamidreza Aftabi.
Special thanks to the UBC Cloud Innovation Centre Technical and Project Management teams for their guidance and support.
This project is distributed under the MIT License.
Licenses of libraries and tools used by the system are listed below:
- PostgreSQL License
  - For PostgreSQL and pgvector
  - "a liberal Open Source license, similar to the BSD or MIT licenses."
- Llama 3 Community License Agreement
  - For the Llama 3 70B Instruct model