Image
Supported formats
BMP, GIF, JPEG, PNG, TIFF, WebP, SVG, ICO
Description
This backend processes image data in various formats, extracts metadata (including Exif), and performs additional analysis when necessary.
Available in Contextal Platform 1.0 and later.
Features
Brand Detection
The image processor runs our custom AI model, optimized for visual detection of well-known brands that are frequently impersonated in phishing attacks. It can spot logos anywhere within an image and automatically includes the findings in the object metadata under the $logos key.
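Detected brands can then be searched for in queries. A minimal sketch, assuming entries of the $logos array can be matched with @match_object_meta in the same way as scalar metadata keys:
object_type == "Image"
&& @match_object_meta($logos == "adobe")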
NSFW Detection
The backend uses a neural network to detect images with NSFW (Not Safe For Work) content, additionally providing a category of such content. It uses GantMan's model to identify the following categories:
Drawings→ safe for work drawings (including anime)
Hentai→ hentai and pornographic drawings
Porn→ pornographic images, sexual acts
Sexy→ sexually explicit images, not pornography
Neutral→ safe for work neutral images
The final verdict is stored in the object's metadata under the $nsfw_verdict key, which may have a value of Unknown in case there wasn't enough confidence to assign one of the above categories.
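A query sketch using the verdict, following the same @match_object_meta pattern as the examples below, for instance to find images where no category could be assigned with sufficient confidence:
object_type == "Image"
&& @match_object_meta($nsfw_verdict == "Unknown")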
Optical Character Recognition
The backend performs optical character recognition (OCR) when requested and extracts text for further processing. The text processing backend can later detect the text's language, sentiment, profanities, embedded URLs, potential passwords, and more.
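The extracted text is attached as a Text child object, so it can be reached with @has_child; a minimal sketch (a profanity-based variant appears under Example Queries below):
object_type == "Image"
&& @has_child(object_type == "Text")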
QR Code Detection & Processing
The backend detects QR codes in images, and decodes them for further processing by other backends such as Text, URL, or Domain.
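A decoded QR code containing a link would then typically be handed to the URL backend; a hypothetical sketch, assuming the decoded payload appears as a child with object_type "URL":
object_type == "Image"
&& @has_child(object_type == "URL")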
Symbols
Object
LIMITS_REACHED→ limits triggered while processing an image
SVG_ERRORS→ errors detected while processing an SVG image
SVG_JS→ JavaScript detected inside an SVG image
BLURRED→ large portion of the image is blurred
Children
TOOBIG→ text extracted via OCR was not stored as it exceeds the limits
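Symbols can also be used in queries; a hedged sketch, assuming the platform exposes a has_symbol() predicate for matching an object's symbols (the predicate name and syntax are an assumption here):
object_type == "Image"
&& has_symbol("SVG_JS")
- This would match SVG images in which embedded JavaScript was detected.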
Example Metadata
{
  "org": "ctx",
  "object_id": "c4fc7668fdafdac2839b9dd7049da07b0a12cdab62cc165b878cccdb49478f31",
  "object_type": "Image",
  "object_subtype": "PNG",
  "recursion_level": 1,
  "size": 261648,
  "hashes": {
    "sha512": "15023992bd7568e98d283a3f951f8a4359ea3d58612be78bb9bfb72fd55ba111a98444d1a3a7b08383181cf3f27160f12e441a3ef16afd90e65afdee027a673e",
    "sha256": "c4fc7668fdafdac2839b9dd7049da07b0a12cdab62cc165b878cccdb49478f31",
    "md5": "73fa882983231c19dca6c5f8b4f20fc0",
    "sha1": "02a7f35277c8f1f8321017d67419557943f69f2a"
  },
  "ctime": 1758828920.218504,
  "entropy": 7.9582562355383555,
  "relation_metadata": {
    "name": "blurry.png"
  },
  "ok": {
    "symbols": [
      "BLURRED"
    ],
    "object_metadata": {
      "_backend_version": "2.0.0",
      "format": "png",
      "height": 960,
      "logos": [
        "adobe",
        "adobe_block",
        "adobe_italic",
        "gmail",
        "outlook"
      ],
      "nsfw_predictions": {
        "Drawings": 0.007282,
        "Hentai": 0.000317,
        "Neutral": 0.992383,
        "Porn": 0.000005,
        "Sexy": 0.000013
      },
      "nsfw_verdict": "Neutral",
      "pixel_format": "RGB8",
      "width": 1280
    },
    "children": []
  }
}
Example Queries
object_type == "Email"
&& @has_descendant(object_type == "Image" &&
  (@match_object_meta($nsfw_verdict == "Hentai")
  || @match_object_meta($nsfw_verdict == "Sexy")
  || @match_object_meta($nsfw_verdict == "Porn"))
)
- This query matches an Email object, which at some level contains an Image object with NSFW content.
object_type == "Image"
&& @has_child(object_type == "Text"
&& @match_object_meta($natural_language_profanity_count > 0))
- This query matches an Image object, out of which a Text object was extracted (via OCR), in which some profanities were identified.
Configuration Options
max_child_output_size→ maximum size of a single output child object (default: 41943040)