DevOps W10 In-Class Exercise Template

This repository contains a full-stack canteen application with a SvelteKit client, Spring Boot server, and LLM recommendation service. It demonstrates modern web application architecture and DevOps practices.

Project Overview

This project includes:

  • Client: SvelteKit with TypeScript, TailwindCSS, and reusable UI components for browsing canteen meals.
  • Server: Spring Boot Java application with RESTful APIs, gRPC communication, and PostgreSQL integration.
  • LLM Service: Python FastAPI service for generating meal recommendations using AI.
  • Database: PostgreSQL for storing user preferences and application data.
  • DevOps: Dockerized services, CI/CD pipelines, and production-ready deployment configurations.

Prerequisites

  • Node.js (v22 or later)
  • Java JDK 21+
  • Python 3.x
  • Gradle
  • Docker and Docker Compose
  • Git

Setup Instructions

Clone the Repository

git clone https://github.com/yourusername/w10-template.git
cd w10-template

Client Setup

  1. Navigate to the client directory:
    cd client
  2. Install dependencies:
    npm install

Server Setup

  1. Navigate to the server directory:
    cd server
  2. Build the project:
    ./gradlew build

LLM Service Setup

  1. Navigate to the llm directory:
    cd llm
  2. Create a virtual environment and install the dependencies:
    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install -r requirements.txt

Running the Application

Start the Database

docker compose up -d database
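
To confirm the container is healthy before starting the other services (assuming the Compose service is named database, as in the command above):

docker compose ps database      # container state and mapped port
docker compose logs database    # PostgreSQL startup output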

Start the Client

cd client
npm run dev

The client will be available at http://localhost:3000.

Start the Server

cd server
./gradlew bootRun

The server API will be available at http://localhost:8080.
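
Once Spring Boot logs that the application has started, a quick request against the port confirms the server is reachable (any HTTP response, even a 404, means it is up):

curl -i http://localhost:8080/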

Start the LLM Service

cd llm
python main.py

The LLM service will be available at http://localhost:5000.
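
FastAPI generates interactive API documentation by default, so unless it has been disabled in main.py the service's endpoints can be inspected at http://localhost:5000/docs, or the machine-readable schema fetched directly:

curl http://localhost:5000/openapi.json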

Development Workflow

Client Development

  • Built with SvelteKit and TypeScript for a modern, reactive UI (a typical development loop is sketched after this list).
  • TailwindCSS for styling.
  • Components and routes are organized in the src directory.
  • Features meal browsing, favoriting, and user preferences.
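
A minimal sketch of the day-to-day client loop; npm run check assumes the default SvelteKit svelte-check script is still present in package.json:

cd client
npm run dev      # hot-reloading dev server (see "Running the Application" above)
npm run check    # type-check the Svelte and TypeScript sources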

Server Development

  • Built with Spring Boot for a scalable and maintainable backend.
  • Includes gRPC communication with the LLM service.
  • PostgreSQL integration for user preferences storage.
  • RESTful APIs for canteen data and user management.
  • Gradle is used for dependency management and building.
  • Source code is in the src/main/java directory.
  • Tests are in the src/test/java directory and can be run with the Gradle commands shown below.
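
A typical server development loop, using the Gradle wrapper that ships with the repository:

cd server
./gradlew test       # run the tests under src/test/java
./gradlew bootRun    # start the server against the local database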

LLM Service Development

  • Built with FastAPI for AI-powered meal recommendations.
  • Integrates with external LLM APIs for generating personalized suggestions.
  • Source code is in the llm directory; a hot-reload development loop is sketched below.
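
For a hot-reload development loop the service can be run through uvicorn directly; this assumes uvicorn is listed in requirements.txt, the FastAPI instance in main.py is named app, and port 5000 is free:

cd llm
source .venv/bin/activate
uvicorn main:app --reload --port 5000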

Building for Production

Client Build

cd client
npm run build
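
To preview the production bundle locally (this relies on the default SvelteKit preview script and a preview-capable adapter being configured in the client):

npm run preview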

Server Build

cd server
./gradlew clean build
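
The Spring Boot Gradle plugin places an executable jar under build/libs/; the exact file name depends on the project settings, so the path below is a placeholder:

ls build/libs/
java -jar build/libs/<project>-<version>.jar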

Dockerized Deployment

The project includes Docker configurations for containerized deployment.

Build and Run with Docker Compose

  1. Build and start all services:
    docker compose up --build
  2. Access the application: once the containers are up, the services are reachable on the ports mapped in compose.yml (typically the same ones used when running locally). The commands below cover logs and teardown.
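
Useful follow-up commands while the stack is running:

docker compose logs -f    # follow the logs of all services
docker compose down       # stop and remove the containers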

CI/CD Pipeline

The project includes GitHub Actions workflows for:

  • Building Docker Images: Automatically builds and pushes Docker images to the GitHub Container Registry; they can be pulled as shown below.
  • Deploying Docker Images: Deploys the application to a production environment using Docker Compose.
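
Once a workflow run has pushed images, they can be pulled from the registry; the owner, image name, and tag below are placeholders, and the real values are defined in the workflow files under .github/workflows/:

docker pull ghcr.io/<owner>/<image>:<tag>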

Project Structure

├── client/                  # SvelteKit client
│   ├── src/                 # Source code
│   ├── static/              # Static assets
│   └── package.json         # Client dependencies
│
├── server/                  # Spring Boot server
│   ├── src/                 # Source code including gRPC services
│   ├── build.gradle         # Gradle build file
│   └── Dockerfile           # Server Dockerfile
│
├── llm/                     # Python LLM service
│   ├── main.py              # FastAPI application
│   ├── requirements.txt     # Python dependencies
│   └── Dockerfile           # LLM service Dockerfile
│
├── docs/                    # API documentation (Bruno collection)
├── compose.yml              # Docker Compose for local development
└── .github/workflows/       # CI/CD workflows

API Documentation

API documentation is available in the docs/CanteenApp Bruno directory as a Bruno collection for testing endpoints.

License

This project is licensed under the MIT License.
