Creates a new ToxBlock instance.
Parameters: configuration options.
Throws: when the configuration is invalid.
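The constructor's validation behavior can be sketched as follows. This is a minimal illustration, not the library's actual implementation; the `ToxBlockConfig` shape and the `apiKey` field name are assumptions.

```typescript
// Hypothetical config shape; the apiKey field name is an assumption.
interface ToxBlockConfig {
  apiKey: string;
}

// Sketch of constructor-time validation: an invalid configuration throws.
function validateConfig(config: ToxBlockConfig): void {
  if (!config.apiKey || config.apiKey.trim() === '') {
    throw new Error('ToxBlock: invalid configuration (apiKey is required)');
  }
}

validateConfig({ apiKey: 'test-key' }); // a valid config passes silently
```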
Checks whether the provided text contains profanity or toxic content.
Parameters: the text to analyze.
Returns: a promise resolving to the detection result.
Throws: when analysis fails.
const result = await toxBlock.checkText('This is a test');
if (result.isProfane) {
console.log('Profanity detected!');
}
Checks multiple texts in a batch.
Parameters: an array of texts to analyze.
Returns: a promise resolving to an array of detection results.
Throws: when the batch analysis fails.
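A batch check can be sketched as a parallel map over the single-text check. The helper name `checkTexts`, the `ToxBlockResult` shape, and the stub detector below are assumptions for illustration, standing in for the real Gemini-backed method.

```typescript
// Hypothetical result shape; the real library may return more fields.
interface ToxBlockResult {
  isProfane: boolean;
}

type CheckFn = (text: string) => Promise<ToxBlockResult>;

// Batch analysis sketched as a parallel map over the single-text check.
async function checkTexts(
  texts: string[],
  checkText: CheckFn
): Promise<ToxBlockResult[]> {
  return Promise.all(texts.map((text) => checkText(text)));
}

// Usage with a stub detector standing in for the Gemini-backed check.
const stubCheck: CheckFn = async (text) => ({
  isProfane: /badword/i.test(text),
});

checkTexts(['hello there', 'badword here'], stubCheck).then((results) => {
  console.log(results.map((r) => r.isProfane)); // [ false, true ]
});
```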
ToxBlock - A professional profanity detection module using Gemini AI
Provides comprehensive text analysis for detecting profanity, toxic content, hate speech, and inappropriate language across multiple languages.
Example
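A fuller usage sketch, with a stub class in place of the real Gemini-backed implementation. The constructor options, the `checkText` method, and the `isProfane` result field follow the surface documented above; the word-list detection inside the stub is purely illustrative.

```typescript
interface ToxBlockResult {
  isProfane: boolean;
}

// Stub standing in for the real ToxBlock class; checkText here matches a
// trivial word pattern instead of calling Gemini AI.
class ToxBlock {
  constructor(private config: { apiKey: string }) {
    if (!config.apiKey) {
      throw new Error('ToxBlock: invalid configuration');
    }
  }

  async checkText(text: string): Promise<ToxBlockResult> {
    return { isProfane: /\bdamn\b/i.test(text) };
  }
}

const toxBlock = new ToxBlock({ apiKey: 'your-gemini-api-key' });

toxBlock.checkText('This is a test').then((result) => {
  if (result.isProfane) {
    console.log('Profanity detected!');
  } else {
    console.log('Text is clean');
  }
});
```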