Brute Force (Flat)

This page describes the brute force (flat) index functions provided as part of the AI libs.

.ai.flat.normalize

The .ai.flat.normalize function normalizes vector embeddings to unit length.

Cosine similarity is mathematically equivalent to the inner product metric performed on normalized vectors. By normalizing your vectors before inserting them into hnsw or flat, and also normalizing incoming search vectors, you can use the inner product metric instead of cosine similarity, thus yielding identical results. This significantly reduces search and insert times, as it removes repeated normalization.

You can use this function; however, note that it is optimized for speed, not memory utilization. If you're converting large vector stores, it's best to process them in smaller chunks using the formula {8h$x%sqrt sum x*x}.
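
As a minimal sketch of that chunked approach, reusing the per-vector formula above; the chunk size of 10,000 and the names vecs and nvecs are purely illustrative:

q)vecs:{(x;y)#(x*y)?1e}[100000;10];                            / large un-normalized store
q)nvecs:raze {{8h$x%sqrt sum x*x} each x} each 0N 10000#vecs;  / normalize 10,000 vectors at a time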

Parameters

Name    Type(s)     Description
embs    real[][]    The original un-normalized vector embeddings

Returns

Type        Description
real[][]    The normalized vector embeddings

Example

q)\l ai-libs/init.q
q)vecs:{(x;y)#(x*y)?1e}[100000;10];
q)\ts:1000 res1:.ai.flat.search[vecs;first vecs;5;`CS]
676 2099232
q)nvecs:.ai.flat.normalize vecs;
q)\ts:1000 res2:.ai.flat.search[nvecs;first nvecs;5;`IP]
544 2099232
q)res1[1]~res2[1]
1b
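
The same treatment applies to incoming search vectors that are not already in the store: normalize the query with the same formula before searching with the `IP metric. A short illustrative sketch continuing from the session above (qv is a made-up query vector):

q)qv:10?1e;                               / new, un-normalized query vector
q)nqv:{8h$x%sqrt sum x*x} qv;             / normalize it the same way as the store
q)res3:.ai.flat.search[nvecs;nqv;5;`IP];  / equivalent to a cosine-similarity search on vecs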

.ai.flat.search

The .ai.flat.search function conducts a parallelized flat search, returning the k nearest neighbors under the provided metric.

Parameters

Name      Type(s)             Description
embs      real[][]            The set of vectors to conduct kNN search against
q         real[] | float[]    The query vector
k         int | long          The number of nearest neighbors to return
metric    symbol              The search metric, one of (L2;CS;IP)

Returns

Type                Description
(real[]; long[])    The distances to the k nearest neighbors under the given metric, and their corresponding indices in embs

Example

q)\l ai-libs/init.q
q)vecs:{(x;y)#(x*y)?1e}[1000;10];
q).ai.flat.search[vecs;first vecs;5;`L2]
0 0.2658583 0.3118592 0.3484065 0.3911282
0 404       631       93        241
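
The result is a two-item list, so the distances and indices can be unpacked directly, and a projection makes it easy to run the same search over a batch of queries. A brief sketch continuing from the example above (dists, inds, neighbours, qs, and batch are illustrative names):

q)res:.ai.flat.search[vecs;first vecs;5;`L2];
q)dists:res 0; inds:res 1;                     / distances and row indices into vecs
q)neighbours:vecs inds;                        / the matching vectors themselves
q)qs:3 10#30?1e;                               / three hypothetical query vectors
q)batch:.ai.flat.search[vecs;;5;`L2] each qs;  / one (distances;indices) pair per query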