NeuroFeel: Lightweight Emotion Detection for Edge AI

By BoltUIX Team in AI & Machine Learning · June 12, 2025

Overview

NeuroFeel is a lightweight NLP model built on NeuroBERT and fine-tuned for short-text emotion detection. At ~25MB and ~7M parameters, it classifies text into 13 emotional categories (e.g., Happiness 😄, Sadness 😢, Love ❤️). Optimized for edge AI, IoT, and mobile apps, it delivers real-time, offline emotion analysis for chatbots, social media moderation, mental health monitoring, and wearable devices.

NeuroFeel brings nuanced emotion detection to edge devices with privacy and efficiency.

BoltUIX Team, AI Innovation 2025

Key Features

  • Ultra-Compact: ~25MB footprint for low-storage devices.
  • Rich Emotions: Detects 13 emotions with emoji mappings.
  • Offline: No internet required.
  • Real-Time: <40ms latency on Raspberry Pi 4 (see the benchmark sketch after this list).
  • Privacy-First: On-device processing.
  • Versatile: Supports emotion detection, sentiment, and tone analysis.
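
Latency varies with hardware, input length, and library versions, so it's worth measuring on your own device. A minimal sketch, reusing the same Hugging Face pipeline as the quickstart below:

import time
from transformers import pipeline

# Load the classifier once; the first call also pays one-time warm-up costs
classifier = pipeline("text-classification", model="boltuix/NeuroFeel")

# Warm up so the measurement reflects steady-state inference
classifier("warm up")

# Average latency over 100 single-sentence calls
start = time.perf_counter()
for _ in range(100):
    classifier("i love you")
elapsed = time.perf_counter() - start
print(f"Average latency: {elapsed / 100 * 1000:.1f} ms")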

Model Training Tutorial

Watch this step-by-step guide to train your machine learning model:

Training Tutorial

Supported Emotions

Emotion | Emoji
Sadness | 😢
Anger | 😠
Love | ❤️
Surprise | 😲
Fear | 😱
Happiness | 😄
Neutral | 😐
Disgust | 🤢
Shame | 🙈
Guilt | 😔
Confusion | 😕
Desire | 🔥
Sarcasm | 😏

Model Architecture

  • Layers: 4 transformer layers
  • Hidden Size: 256
  • Attention Heads: 8
  • Parameters: ~7M
  • Quantization: INT8
  • Vocabulary Size: 30,522 tokens
  • Max Sequence Length: 64 tokens
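
These numbers can be checked against the published checkpoint through the standard Hugging Face config API. A small sketch; the attribute names assume a BERT-style config, which is what this model card describes:

from transformers import AutoConfig

# Inspect the architecture fields of the published checkpoint
config = AutoConfig.from_pretrained("boltuix/NeuroFeel")
print(config.num_hidden_layers)        # transformer layers
print(config.hidden_size)              # hidden size
print(config.num_attention_heads)      # attention heads
print(config.vocab_size)               # vocabulary size
print(config.max_position_embeddings)  # maximum positions (may be larger than the 64-token limit quoted above)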

Installation


pip install transformers torch
                        

Requires Python 3.6+ and ~25MB of storage.

Quickstart: Emotion Detection

Basic Inference


from transformers import pipeline

# Load model
sentiment_analysis = pipeline("text-classification", model="boltuix/NeuroFeel")

# Analyze emotion
result = sentiment_analysis("i love you")
print(result)
                        

Output: [{'label': 'Love', 'score': 0.8563215732574463}]

Extended Example with Emojis


from transformers import pipeline

# Load model
sentiment_analysis = pipeline("text-classification", model="boltuix/NeuroFeel")

# Emoji mapping
label_to_emoji = {
    "Sadness": "😒", "Anger": "😠", "Love": "❀️", "Surprise": "😲", "Fear": "😱",
    "Happiness": "πŸ˜„", "Neutral": "😐", "Disgust": "🀒", "Shame": "πŸ™ˆ", "Guilt": "πŸ˜”",
    "Confusion": "πŸ˜•", "Desire": "πŸ”₯", "Sarcasm": "😏"
}

# Input text
text = "i love you"

# Analyze emotion
result = sentiment_analysis(text)[0]
label = result["label"].capitalize()
emoji = label_to_emoji.get(label, "❓")

# Output
print(f"Text: {text}")
print(f"Predicted Emotion: {label} {emoji}")
print(f"Confidence: {result['score']:.2%}")
                        

Output: Text: i love you
Predicted Emotion: Love ❤️
Confidence: 85.63%

Dataset Download

Access the Emotions Dataset to enhance your AI models:

Emotions Dataset
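
Assuming the dataset is distributed as a CSV with one text column and one label column (the same layout the fine-tuning guide below expects; the file name here is a placeholder), a quick way to inspect it:

import pandas as pd

# Placeholder path: point this at the downloaded dataset file
df = pd.read_csv("emotions_dataset.csv")

# Check the shape and label distribution before training
print(df.head())
print(df.iloc[:, -1].value_counts())  # label column, matching the fine-tuning guide's layout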

Use Cases

  • Chatbots: Tailor responses for emotions like “Love ❤️” or “Sadness 😢”.
  • Social Media: Tag posts for moderation, e.g., “Disgust 🤢”.
  • Mental Health: Monitor mood, e.g., “Sadness 😢” for wellness apps.
  • Smart Replies: Suggest emojis for “Happiness 😄” (see the sketch after this list).
  • IoT Devices: Adjust settings for “Fear 😱” or “Anger 😠”.
  • Voice Assistants: Parse emotions locally.
  • Toy Robotics: Emotion-driven interactions.
  • Wearables: Track mood, e.g., “Fear 😱” to trigger breathing exercises.
  • Smart Homes: Adjust lighting for “Sadness 😢”.
  • Customer Support: Escalate tickets for “Anger 😠”.
  • Education: Tailor explanations for “Confusion 😕”.
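
As a concrete illustration of the smart-reply item above, here is a hypothetical suggest_emoji helper that maps the model's prediction to an emoji and falls back to neutral below a confidence threshold. The mapping reuses the table from the extended example; the threshold value is an arbitrary choice, not part of the model:

from transformers import pipeline

classifier = pipeline("text-classification", model="boltuix/NeuroFeel")

# Same emoji mapping as in the extended example above
label_to_emoji = {
    "Sadness": "😢", "Anger": "😠", "Love": "❤️", "Surprise": "😲", "Fear": "😱",
    "Happiness": "😄", "Neutral": "😐", "Disgust": "🤢", "Shame": "🙈", "Guilt": "😔",
    "Confusion": "😕", "Desire": "🔥", "Sarcasm": "😏"
}

def suggest_emoji(message: str, min_confidence: float = 0.5) -> str:
    """Suggest an emoji for a message; fall back to neutral when unsure."""
    result = classifier(message)[0]
    if result["score"] < min_confidence:
        return "😐"
    return label_to_emoji.get(result["label"], "😐")

print(suggest_emoji("this made my day!"))  # e.g. 😄 for a happy message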

Hardware Requirements

  • Processors: CPUs, NPUs, microcontrollers (e.g., ESP32-S3, Raspberry Pi 4)
  • Storage: ~25MB
  • Memory: ~70MB RAM
  • Environment: Offline or low-connectivity
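
To sanity-check the RAM figure on your own hardware, one option is to compare the process's resident memory before and after loading the model. A sketch using the third-party psutil package (installed separately with pip install psutil):

import os
import psutil
from transformers import pipeline

process = psutil.Process(os.getpid())
baseline = process.memory_info().rss

# Load the model and run one inference so all buffers are allocated
classifier = pipeline("text-classification", model="boltuix/NeuroFeel")
classifier("warm up")

used_mb = (process.memory_info().rss - baseline) / 1e6
print(f"Approximate memory used by the model: {used_mb:.0f} MB")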

Training Details

  • Dataset: ~10,000 custom samples + ChatGPT-augmented data
  • Training: 5 epochs, batch size 16, learning rate 2e-5, AdamW optimizer
  • Hardware: NVIDIA A100 GPU (training), edge devices (inference)
  • Quantization: INT8 for ~25MB size
  • Validation F1: ~0.93
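
The post doesn't spell out the quantization step. One common way to produce INT8 weights in PyTorch is dynamic quantization of the linear layers; the sketch below shows that approach applied to the checkpoint produced by the fine-tuning guide, which may differ from the exact pipeline the authors used:

import torch
from transformers import BertForSequenceClassification

# Load a fine-tuned FP32 checkpoint (./neuro-feel is produced by the guide below)
model = BertForSequenceClassification.from_pretrained("./neuro-feel")

# Swap Linear layers for INT8 dynamically quantized equivalents
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

# Persist the smaller weights for edge deployment
torch.save(quantized.state_dict(), "./neuro-feel/model_int8.pt")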

Fine-Tuning Guide


import pandas as pd
from transformers import BertTokenizer, BertForSequenceClassification, Trainer, TrainingArguments
from sklearn.model_selection import train_test_split
import torch
from torch.utils.data import Dataset

# Load data
dataset_path = '/content/dataset.csv'
df = pd.read_csv(dataset_path)
df = df.dropna(subset=['Label'])
df.columns = ['text', 'label']

# Encode labels
labels = sorted(df["label"].unique())
label_to_id = {label: idx for idx, label in enumerate(labels)}
id_to_label = {idx: label for label, idx in label_to_id.items()}
df['label'] = df['label'].map(label_to_id)

# Train/val split
train_texts, val_texts, train_labels, val_labels = train_test_split(
    df['text'].tolist(), df['label'].tolist(), test_size=0.2, random_state=42
)

# Tokenizer
tokenizer = BertTokenizer.from_pretrained("boltuix/NeuroBERT-Pro")

# Dataset class
class SentimentDataset(Dataset):
    def __init__(self, texts, labels, tokenizer, max_length=128):
        self.texts = texts
        self.labels = labels
        self.tokenizer = tokenizer
        self.max_length = max_length

    def __len__(self):
        return len(self.texts)

    def __getitem__(self, idx):
        encoding = self.tokenizer(
            self.texts[idx],
            padding='max_length',
            truncation=True,
            max_length=self.max_length,
            return_tensors='pt'
        )
        return {
            'input_ids': encoding['input_ids'].squeeze(0),
            'attention_mask': encoding['attention_mask'].squeeze(0),
            'labels': torch.tensor(self.labels[idx], dtype=torch.long)
        }

# Load datasets
train_dataset = SentimentDataset(train_texts, train_labels, tokenizer)
val_dataset = SentimentDataset(val_texts, val_labels, tokenizer)

# Load model
model = BertForSequenceClassification.from_pretrained(
    "boltuix/NeuroBERT-Pro",
    num_labels=len(label_to_id)
)

# Ensure contiguous tensor layout
for param in model.parameters():
    param.data = param.data.contiguous()

# Training arguments
training_args = TrainingArguments(
    output_dir='./results',
    run_name="NeuroFeel",
    num_train_epochs=5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    warmup_steps=500,
    weight_decay=0.01,
    logging_dir='./logs',
    logging_steps=10,
    eval_strategy="epoch",
    report_to="none"
)

# Trainer setup
trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    eval_dataset=val_dataset
)

# Train and evaluate
trainer.train()
trainer.evaluate()

# Save model and label mappings
model.config.label2id = label_to_id
model.config.id2label = id_to_label
model.config.num_labels = len(label_to_id)
model.save_pretrained("./neuro-feel")
tokenizer.save_pretrained("./neuro-feel")
print("βœ… Training complete. Model and tokenizer saved to ./neuro-feel")
                        
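Once training finishes, the saved model can be loaded back through the same pipeline API used in the quickstart; because label2id and id2label were written into the config above, predictions come back with the original label names:

from transformers import pipeline

# Load the locally fine-tuned model and tokenizer saved above
classifier = pipeline("text-classification", model="./neuro-feel", tokenizer="./neuro-feel")

print(classifier("i feel completely lost"))
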

Comparison to Other Models

Model | Parameters | Size | Edge/IoT Focus | Tasks
NeuroFeel | ~7M | ~25MB | High | Emotion Detection, Classification
NeuroBERT | ~7M | ~30MB | High | MLM, NER, Classification
BERT-Lite | ~2M | ~10MB | High | MLM, NER, Classification
DistilBERT | ~66M | ~200MB | Moderate | MLM, NER, Classification, Sentiment

Frequently Asked Questions (FAQ)

Q: What is NeuroFeel?
A: NeuroFeel is a lightweight model for detecting 13 emotions in short texts, optimized for edge AI and IoT.

Q: Which emotions does it detect?
A: It detects 13 emotions, including Happiness 😄, Sadness 😢, Love ❤️, and Sarcasm 😏.

Q: Does it work offline?
A: Yes, it’s designed for offline use on edge devices.

Q: Can it be fine-tuned for other tasks?
A: Yes, it can be fine-tuned for custom emotions or other NLP tasks like QA or NER.

Q: What hardware does it require?
A: It runs on CPUs, NPUs, and microcontrollers with ~25MB storage and ~70MB RAM.

License

Apache-2.0 License: Free to use. See LICENSE.

Conclusion

NeuroFeel delivers real-time emotion detection with 13 nuanced categories, optimized for edge AI and IoT. Ideal for chatbots, social media, mental health, and wearables, it’s your solution for expressive AI in 2025. Explore it on Hugging Face!
