You don’t need to run full AI models to get some of the benefits that have come out of the Gen AI / large language model (LLM) wave, especially when it comes to search.
One of the interesting components to come out of this wave is the practical use of embeddings, which represent a concept as a high-dimensional vector (a series of decimal numbers). While the idea of embeddings predates LLMs (Word2vec appeared in 2013), the embedding models we have access to now, like OpenAI’s text-embedding-3-small, have been enhanced to understand context: for example, the word “ruler” can refer to a measuring device or to a king. The term used here is “attention”, in the sense of paying attention to the rest of the context. Humans can easily differentiate the two meanings of “ruler” in the phrases “I want to measure this wood, please give me the ruler” and “King George was a ruler of England”.
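To make the vector idea concrete, here is a minimal sketch of how embeddings are compared. The vectors below are tiny made-up stand-ins (real models like text-embedding-3-small return vectors with 1,536 dimensions, fetched via an API call), but the comparison step, cosine similarity, is the same one used in real embedding search:

```python
import math

# Toy 4-dimensional vectors standing in for real embeddings.
# These numbers are invented for illustration: a context-aware model
# would embed each whole phrase, so the two senses of "ruler" end up
# pointing in different directions.
ruler_measuring = [0.9, 0.1, 0.2, 0.0]  # "please give me the ruler"
ruler_king      = [0.1, 0.9, 0.0, 0.2]  # "King George was a ruler"
tape_measure    = [0.8, 0.2, 0.3, 0.1]  # a related measuring device

def cosine_similarity(a, b):
    """Closeness of two vectors: ~1.0 = same direction, ~0.0 = unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

sim_measuring = cosine_similarity(ruler_measuring, tape_measure)
sim_king = cosine_similarity(ruler_king, tape_measure)

# The measuring-device sense lands much closer to "tape measure"
# than the king sense does.
print(sim_measuring > sim_king)  # → True
```

In a real system you would swap the toy vectors for the output of an embedding model and store them in a vector index, but the ranking step stays this simple.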