e-motion2.0 is a project that analyzes emotions from facial features in image data. It includes Python scripts, HTML templates, CSS styles, and a pre-trained model to deliver emotion detection.
- Emotion detection from images
- Frontend built with HTML, CSS, and JavaScript
- Backend powered by Python Flask and a pre-trained model
e-motion/
│
├── client/
│   ├── static/
│   │   └── app.js          # Static JavaScript file
│   ├── styles/
│   │   └── main.css        # CSS file
│   └── templates/
│       └── index.html      # HTML template
│
├── server/
│   ├── pythonScripts/      # User-defined Python package
│   │   ├── __init__.py
│   │   ├── clickPhoto2.py  # Script to capture a photo and save it
│   │   ├── preProcess.py   # Script to pre-process the data (image file)
│   │   └── predict.py      # Script to analyze the data using the pre-trained model
│   ├── model2.h5           # Pre-trained model
│   ├── requirements.txt    # Python dependencies
│   └── server.py           # Main server script
│
├── .gitignore              # gitignore file
│
└── README.md               # You are here :)
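For a sense of what the pipeline in pythonScripts/ might look like, here is a minimal sketch of the pre-processing step. The 48x48 grayscale convention is a common choice for facial-emotion models, and the function names are illustrative assumptions, not taken from the actual source:

```python
# Hypothetical sketch of a preProcess.py-style step: convert an RGB pixel
# grid to grayscale luminance, then normalize into the [0, 1] range that
# a typical pre-trained model expects. Names and sizes are assumptions.

def to_grayscale(rgb_pixels):
    """Convert a 2-D grid of (R, G, B) tuples to luminance values (0-255)."""
    return [
        [0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
        for row in rgb_pixels
    ]

def normalize(gray):
    """Scale 0-255 luminance values into the [0, 1] range."""
    return [[v / 255.0 for v in row] for row in gray]

# Tiny 2x2 stand-in for a captured photo.
image = [[(255, 255, 255), (0, 0, 0)],
         [(128, 128, 128), (255, 0, 0)]]
prepped = normalize(to_grayscale(image))
```

In the real project this work would likely be done with a library such as OpenCV or NumPy; the sketch only shows the shape of the transformation.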
- Python 3.x
- Flask
- Clone the repository:
git clone https://github.com/PALLADIUM26/e-motion.git
- Navigate to the project directory:
cd e-motion
- Install dependencies:
cd server
pip install -r requirements.txt
- Run the server from the terminal:
cd server
python server.py
- Navigate to http://localhost:5000 in your browser to start using e-motion.
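Under the hood, predict.py presumably maps the model's output probabilities to an emotion label. A minimal sketch of that final step, assuming a seven-class FER-style label set (the labels and function name are assumptions; the actual classes depend on how model2.h5 was trained):

```python
# Hypothetical label set; the real classes come from the model's training data.
EMOTIONS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]

def label_from_probs(probs):
    """Return the emotion whose predicted probability is highest."""
    best = max(range(len(probs)), key=lambda i: probs[i])
    return EMOTIONS[best]

# Example: a mock probability vector as the model might emit.
scores = [0.05, 0.01, 0.04, 0.70, 0.10, 0.05, 0.05]
result = label_from_probs(scores)  # -> "happy"
```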
Contributions are welcome! Feel free to submit issues or pull requests.