Getting started with Odin
Prompting like a Pro
12 min
Even when working with large language models (LLMs), clear communication remains essential. This article outlines best practices for writing effective prompts when using various features of the GetFocus platform.

Before diving into how to prompt within specific features, there is one overarching principle to keep in mind:

🗝️ Be clear and specific about your requirements and the output you expect.

LLMs have evolved significantly, and with improved reasoning capabilities there is no longer a need for overly complex prompts or for assigning roles like "You are a market analyst." Instead, focus on clarity and precision.

Prompting in General Chat

In the General Chat, you can explore any topic and get an initial understanding of it. This is useful for quick learning, brainstorming, or background research.

When using the Technology Scouting feature, the way you phrase your prompt has a major impact on the results. The system can categorize a wide range of technologies, from established to emerging and niche, but the specificity of your prompt determines how broad or focused the returned list will be. Here are two examples to illustrate this.

🔋 Example 1: Battery technologies

- General prompt: "Energy storage technologies" returns a broad range of technologies such as lithium-ion batteries, flow batteries, supercapacitors, and mechanical storage systems.
- Specific prompt: "Battery chemistries" focuses the results on different chemical compositions, such as lithium-sulfur, sodium-ion, or solid-state batteries.
- Highly specific prompt: "Battery chemistry for electric aviation" narrows the scope even further, to technologies optimized for weight, energy density, and safety in aviation use cases.

Each level of specificity results in a different depth and focus of technologies.

🧬 Example 2: Medical diagnostics

- General prompt: "Medical diagnostic technologies" returns a wide set of categories including imaging technologies, wearable sensors, biosensors, and lab-on-a-chip devices.
- Highly specific prompt: "Diagnostic technologies for early cancer detection" targets a narrower set of tools such as liquid biopsies, circulating tumor DNA (ctDNA) analysis, and AI-based imaging solutions.

Again, a more focused prompt will guide the system toward more targeted, relevant technologies.

After generating a list of technologies, the next step is usually to down-select and identify the most relevant options for your use case. As explained in the GetFocus Feature Guide, it is important during scouting to ensure that the technologies you compare are truly competing (serving as alternatives) and that comparisons are made like with like. Here are some example prompts that can help you achieve this:

- "Identify which of the scouted technologies are competing alternatives for energy storage."
- "Identify which of the scouted technologies are comparable head to head."

When it comes to evaluation, there are two main approaches, depending on how much control you want to retain.

✅ Option 1: Give more control to the LLM

If speed is your priority, you can let the model take the lead. For example:

Prompt example: "Evaluate all technologies for the following application area: energy storage for electric vehicles."

This approach is fast but gives the LLM broad freedom in how it interprets and structures the evaluation.

✅ Option 2: Stay in control

If you prefer a more structured and tailored evaluation, include specific instructions in your prompt. For example:

- Define evaluation criteria (e.g., pressure resistance, weight, emissions).
- Request scoring or binary values (e.g., a 1–10 scale, or yes/no).
- Specify the output format for clarity (e.g., a table with scores and reasoning).

Prompt example: "Evaluate all technologies with a score from 1 to 10 based on their ability to withstand high temperatures. Provide the output in a table format, including a short explanation for each score."

This way, you remain in control of the evaluation framework while still leveraging the LLM's reasoning capabilities.
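If you reuse the "stay in control" pattern often, it can help to think of the prompt as three building blocks: criteria, scoring scale, and output format. The sketch below is a minimal illustration of that idea in Python. It only assembles a prompt string you could paste into the evaluation step; the function name and parameters are hypothetical and not part of GetFocus.

```python
def build_evaluation_prompt(criteria, scale="1 to 10", output_format="a table"):
    """Assemble a 'stay in control' evaluation prompt from its three building blocks.

    Hypothetical helper for illustration only; GetFocus does not expose this function.
    """
    criteria_text = ", ".join(criteria)
    return (
        f"Evaluate all technologies with a score from {scale} "
        f"based on the following criteria: {criteria_text}. "
        f"Provide the output in {output_format}, "
        "including a short explanation for each score."
    )


# Example: reproduce the high-temperature evaluation prompt shown above.
print(build_evaluation_prompt(
    criteria=["ability to withstand high temperatures"],
    scale="1 to 10",
    output_format="a table format",
))
```

However you write it, the important part is that criteria, scale, and format are stated explicitly rather than left for the model to decide.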
Write Effective Search Queries

Searching in GetFocus is straightforward, and there are multiple ways to find relevant patents (read more in Searching with Odin). However, when searching by technology name, it is worth giving extra thought to your query.

When searching for patents, pay attention to the level of detail in your query. Patents are legal documents that often cover multiple applications, materials, or chemistries rather than focusing on just one. The more detailed your query, the narrower and deeper your results will be; the broader your query, the more diverse your results become. Let's take a look at some search query examples.

Very specific: "Hydrometallurgical recycling of lithium-ion batteries for recovery of cobalt and nickel."

- Very narrow: only patents that match this precise combination will appear.
- Risk: results may be too few, especially in niche or emerging domains.
- Use this approach when you want to quickly check whether a technology is already patented for a specific material, use case, or application.

Specific: "Hydrometallurgical recycling of lithium-ion batteries."

- Broader than the previous one, retrieving more patents for an overview of relevant inventions.
- Best for building a dataset when you want a full picture of how a technology is applied to a specific use case.
- Works well for established technologies where enough data exists.

Broader: "Hydrometallurgical recycling of batteries."

- Returns a large and varied set of patents, including some only loosely connected to your original intent.
- Useful for scouting emerging or niche domains, where being too specific might exclude valuable results.
- Advantage: increases the chance of discovering unexpected but relevant insights, since patents often use broad legal and technical language.

With these examples in mind, you can adjust the level of detail in your query depending on your goal.

Filtering Datasets with the LLM Filter

Another key area where prompting skills are important is when filtering datasets with the LLM filter (for an overview of what the LLM filter is and how it works, see the dedicated LLM filter article). As with any other prompt, clarity in your instructions is essential: the way you phrase your filter prompt directly impacts the quality and precision of the results.

Here are some best practices for writing effective LLM filter prompts:

- Start with a clear filtering command. Example: "Find/only include patents/inventions that…"
- Use strong, directive phrasing. Prefer words like "explicitly," "specifically," "primary," "focus on," or "discuss." Avoid vague verbs like "mention," since patents often reference technologies without making them the central subject.
- Combine technology and use case requirements. Example: "The patent must explicitly state both the technology (lithium-sulfur batteries) and the use case (electric aviation)."
- Define what counts as "relevant." Example: "Include patents that specifically propose a novel technical solution, not just incremental design variations."
- Use exclusion rules for precision. Example: "If not all conditions are met, exclude the patent." This enforces strict filtering and ensures only highly relevant results are returned. Example: "Exclude patents focused solely on consumer electronics, unless they directly address energy storage at grid scale."

To make these best practices more concrete, let's look at a few examples of bad prompts versus well-structured ones.

❌ Bad prompt: "Show me patents that mention batteries."

- Too vague: it does not specify technology, use case, or context.
- Likely to return irrelevant results where batteries are only referenced in passing.

✅ Good prompt: "Find patents that explicitly discuss lithium-ion battery technologies."

- Clear and directive.
- Focuses results on patents where lithium-ion batteries are the main subject.

✅ Good prompt: "Only include patents that explicitly state both the technology (solid-state batteries) and the use case (electric aviation). If both conditions are not met, exclude the patent."

- Includes multiple conditions for precision.
- Uses strict exclusion logic to filter out noise.
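The same building-block idea applies to filter prompts: a technology, a use case, and an explicit exclusion rule. The snippet below is a small, hypothetical sketch of how those conditions compose into a strict filter prompt like the last good prompt above; it is not part of the GetFocus platform, and the resulting text is simply what you would paste into the LLM filter.

```python
def build_filter_prompt(technology, use_case, exclusions=None):
    """Compose a strict LLM filter prompt: all conditions must be met, otherwise exclude.

    Hypothetical helper for illustration only; GetFocus does not expose this function.
    """
    prompt = (
        f"Only include patents that explicitly state both the technology ({technology}) "
        f"and the use case ({use_case}). "
        "If both conditions are not met, exclude the patent."
    )
    for rule in exclusions or []:
        prompt += f" Exclude patents {rule}."
    return prompt


# Example: the strict solid-state battery filter from above, plus one exclusion rule.
print(build_filter_prompt(
    technology="solid-state batteries",
    use_case="electric aviation",
    exclusions=["focused solely on consumer electronics"],
))
```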
Prompting for Chat with Set and Chat with Invention

Another area where prompting skills are essential is when performing deep dives using Chat with Set (CWS) or Chat with Invention (CWI).

Prompting Tips for Chat with Set

Remember: in Chat with Set, the responses are drawn from the patent data in the dataset you have created. However, during longer conversations, the LLM may also draw on its general knowledge. To keep results anchored in the dataset, include phrases like:

- "According to the set…"
- "Based on the patent data…"

To help you get started, here are a few ideas for prompts you might try in CWS:

- Example: "According to the set, what are the key innovation drivers of [technology name]?"
- Example: "What are the recent advancements in [technology X] according to the set?"

⚠️ Keep in mind that the LLM will decide on its own what counts as "recent." To stay in full control, either specify the years directly in your prompt (e.g., "…between 2018 and 2023") or apply a publication date filter before using CWS.

- Example: "According to the set, compare the patent portfolios of [organization A] and [organization B] in the field of [technology name]. Identify white spaces by highlighting areas where one organization is active but the other has limited or no coverage. Provide the output in a table."
- Example: "According to the set, what are the main technical challenges in the development and application of [technology name]? Summarize the challenges in a structured list, and for each one provide a short explanation based on the patent data."
Prompting Tips for Chat with Invention

When it comes to Chat with Invention, the main recommendation remains the same: keep your prompts clear and directive. Here are some examples to help you get started:

- Example: "List the main claims of this patent and summarize each in one sentence."
- Example: "List the materials mentioned in this invention and provide the output in a table with columns for material, role, and section of the patent where it appears."
- Example: "Based on the claims, what aspects of this invention appear to be novel compared to conventional lithium-ion batteries?"
- Example: "According to this patent, what potential application areas are explicitly described or implied for this invention?"
- Example: "Summarize how this invention could be applied in electric aviation, including any technical advantages mentioned in the patent text."
- Example: "What technical challenges are acknowledged or suggested in this patent regarding scalability of the invention?"