DeepSeek looks like it’s preparing to move beyond chat and into search, right as AI starts to reshape how people look up information. Multiple job postings this month suggest the Chinese startup is building DeepSeek AI search, a product aimed at multilingual queries and newer kinds of inputs, Bloomberg reports.
The work described in the ads points to a multimodal engine, one that can take text, images, and audio. That’s a direct fit for phone-first searching, where a screenshot, photo, or voice clip is often the real query.
The job posts spell out the build
The roles call for specialists to build an AI search engine that supports different languages and can process more than typed text. That suggests DeepSeek wants results that feel closer to an answer than a link list, especially when the input is messy.
The same listings also stress the infrastructure that makes AI search reliable: training data, evaluation systems, and platforms designed to support the work. That’s the less flashy side of the product, but it’s the part that decides whether an AI search tool is helpful or confidently wrong. It’s a practical signal that DeepSeek is moving from model bragging rights to day-to-day utility.
The timing adds weight. DeepSeek rattled the AI sector last January with its R1 model, which rivaled leading US options and was said to cost far less to build. Since then, industry watchers have been waiting for the next step.
Why this matters for Google
Google still owns the default search habit, but AI is eating into the experience people care about most: getting something useful fast. A multilingual, multimodal entrant could challenge Google where classic search struggles, by interpreting images and audio instead of forcing everything into keywords.
DeepSeek’s hiring also points to agents, tools meant to run with limited human oversight. The company is staffing for agent training data, evaluation, and dedicated platforms, and it even signals an expectation of many agents running persistently. Put those pieces together and you get a broader play: search finds information, agents act on it, and the product starts to look more like an assistant than a search page.
What to watch next
The biggest question is distribution. The postings don’t say whether DeepSeek AI search will become a standalone destination, a developer API, or a feature inside an existing service, and that choice will determine who touches it first.
There are a couple of hints to track. DeepSeek published a paper in late December outlining a more efficient approach to developing AI, and it recently referenced “model1” on its public GitHub account. If DeepSeek follows up with a public search preview or ships persistent agent features people can try, it’ll be the clearest sign yet that the company wants a real slice of Google’s turf.