# Overview

The AskNews SDK allows you to access a variety of news-related endpoints. This includes:

* 📰 News: Natural language, low-latency, prompt-optimized endpoint. Ready to be the news-context link in your LLM chain.
* 📖 Stories: Get the hottest topics currently circulating in the news-sphere. Track sentiment, coverage, and source origin through time. Benefit from state-of-the-art clustering and custom-written stories based on human-in-the-loop editorial.
* 💬 Chat: News-infused assistant; interact with it like any other OpenAI model.
* 🔮 Forecast: Make any news-related forecast imaginable, based on rich temporal context and SOTA LLMs.
* 👄 Reddit: Search Reddit live and get fully structured, GPT-summarized results, ready to contextualize your LLM.
* 📈 Analytics: AskNews analytics on financial assets and politics.
* 🔎 Sources: Review the distribution of sources underpinning AskNews for any given period of time.
* 🔗 LangChain: Use AskNews as a LangChain retriever or tool (see the sketch at the end of this page).

AskNews infuses any LLM with the latest news, using a single natural language query. Specifically, AskNews enriches over 300k articles per day, indexes them into a hot vector database, and puts that vector database on an endpoint for you. When you query AskNews, you get back a prompt-optimized string that goes directly into your prompt. This means that you do not need to manage your own news RAG, and you do not need to worry about how to properly convey news information in a condensed way to your LLM.

Python (sync)

```python
from asknews_sdk import AskNewsSDK

ask = AskNewsSDK()
query = "Effect of fed policy on tech sector"

# prompt-optimized string ready to go for any LLM:
news_context = ask.news.search_news(query).as_string
```

# What is AskNews doing to keep data hot and high-quality?

Building and managing a high-quality real-time news Retrieval Augmented Generation (RAG) architecture includes:

* 🪣 Scraping 50k news websites every 5 minutes,
* 🧽 Cleaning, summarizing, translating, and enriching 300k articles per day,
* 🧮 Embedding the articles with dense and sparse vectors,
* 💾 Storing the documents in an ever-growing vector database,
* 🔬 Monitoring for quality and ensuring up-time reliability,
* 🏎 Ensuring low-latency interactions to avoid slowing down your LLM application,
* 🧑‍🔬 Researching and developing state-of-the-art (SOTA) methods for entity extraction and quality/accuracy control,
* 🧭 Researching and improving methods for retrieval on dense and sparse vector indices,
* 🌍 Tracking news narratives through time with SOTA clustering/tracking methods.

When you use AskNews, you get all of this with a single line of code.

# Performance 🏎

A quantitative benchmark comparing AskNews against JinaAI, Tavily, and Exa indicates that AskNews is 1400% faster than JinaAI and has 78% better context precision than all competitors. The full blog + Google Colab is available here, along with a summary chart of the results.

# Research 🧑🏽‍🔬

We researched, developed, and deployed the best entity extraction model in the world, called GLiNER News. It is currently used to extract entities for all articles and stories in the AskNews database. It is also completely open-source, in case you'd like to run the entity extraction yourself. You can find it on our HuggingFace repository.
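If you want to try the open-source model locally, here is a minimal sketch using the open-source `gliner` package; the model identifier below is an assumption, so check the EmergentMethods HuggingFace repository for the exact ID:

```python
# Minimal local entity-extraction sketch with the open-source gliner package.
# The model ID is an assumption -- verify it on the EmergentMethods
# HuggingFace repository before running.
from gliner import GLiNER

model = GLiNER.from_pretrained("EmergentMethods/gliner_medium_news-v2.1")

text = "The ECB left interest rates unchanged at its meeting in Frankfurt on Thursday."
labels = ["organization", "location", "date"]

# predict_entities returns a list of dicts containing the matched span and its label
for entity in model.predict_entities(text, labels):
    print(entity["text"], "=>", entity["label"])
```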
Our research extends beyond GLiNER; we are actively improving quality and innovating on all aspects of our system. Our full list of scientific publications can be found here.

# Quality Guarantee 👑

This level of dedication to quality and transparency sets us apart from all other news APIs. When you consume AskNews content, you are getting only the best-quality data, guaranteed. We even have our own Quality Assurance and Quality Control that run on every single article/story to ensure that you will not run into a single article with incorrect content, scraping errors, etc.

# Transparency 🔬

Much of this quality comes from our dedication to high-quality foundational software. We also developed and open-sourced Flowdapt, a cluster orchestration software that enables highly reactive and adaptive artificial intelligence applications. This software powers the AskNews backend, ensuring that we can keep up with the ever-changing news landscape without missing a single article.

Beyond these points, we are dedicated to open source, with the majority of our software completely free and open-source. We are fully transparent about our data sources as well, with a transparency dashboard available to track and monitor the data. We are commonly invited to present on our open-source methodologies at conferences, such as GenAI Zurich. You can find all our presentations here.

# Support

If you need help with the AskNews SDK, please join our Discord, where you can ask questions, share your projects, and connect with other AskNews users. If you would like to explore the data before using the SDK, we encourage you to visit the AskNews website at https://asknews.app.

# Academic use

We love supporting academic use cases. If you are interested in accessing AskNews data for an academic project, please reach out to us at contact@asknews.app with a description of your project as well as contact information. In most cases, academic access to AskNews is completely free under the Limited License.

# Quickstart

# Install

We are going to set up an example that infuses GPT-3.5 with the latest news, but you could substitute GPT-3.5 with any LLM you like. First, install the Python package:

```
pip install asknews
```

or install the TypeScript SDK:

```
npm install @emergentmethods/asknews-typescript-sdk
```

# Quick News context in your LLM

Next, import the AskNews and OpenAI SDKs:

Python (sync)

```python
from asknews_sdk import AskNewsSDK
from openai import OpenAI

ask = AskNewsSDK(
    client_id="your_client_id",
    client_secret="your_client_secret",
    scopes=["chat", "news", "stories"]
)
oai_client = OpenAI(api_key="")
```

Your AskNews client_id and client_secret can be generated by going to the AskNews console and creating an account. You create your API credentials and set them in the AskNewsSDK constructor above.

Let's take an example of your LLM interacting with a user who is asking about the current political situation in Germany. If you had your own RAG setup, you would use that query to search a database, retrieve documents, structure them into a prompt-ready format, and inject it into your prompt. With AskNews, you treat it as a one-stop RAG shop. Here's how you would do it:

Python (sync)

```python
def main():
    # Your user asks a question about the current political situation in Germany
    user = {
        "role": "user",
        "content": "What is the current political situation in Germany?"
    }

    # Grab a prompt-optimized string ready to go for your LLM:
    response = ask.news.search_news(
        query=user["content"],  # any natural language query
        n_articles=10,  # control the number of articles to include in the context
        return_type="string",  # you can also ask for "dicts" if you want more information
        method="nl"  # use "nl" for natural language for your search, or "kw" for keyword search
    )
    # now you have a prompt-optimized string:
    news_articles = response.as_string

    # simply infuse that string into the prompt:
    system = {
        "role": "system",
        "content": "A chat between a curious user and an artificial intelligence Assistant. "
                   "The Assistant has access to the following news articles that "
                   f"may be useful for answering the User's questions: {news_articles}"
    }
    response = oai_client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[system, user]
    )

    print(response.choices[0].message.content)

main()
```

Now the response provides the latest updated information about the political situation in Germany:

> **Response from GPT-3.5**
>
> The current political situation in Germany is dynamic and multifaceted.
>
> 1. Internal Security Concerns: There are doubts about Germany's internal security, with increasing threats from outside and vulnerabilities within the state structure, as highlighted in debates over strengthening the Federal Constitutional Court.
> 2. Fiscal Policy: The CDU's party leader, Friedrich Merz, has raised concerns about the financial feasibility of spending 40 billion euros on citizen's income in 2024, especially in the context of maintaining defense capabilities.
> 3. Political Shifts: The SPD has been criticized for moving away from its core voter base and reviving old Russia policies. This has sparked warnings about denying reality and concerns about Germany's foreign policy post-Merkel.
> 4. Youth Political Views: A survey found that a significant portion of first-time voters lean towards conservative views, some even trusting the far-right Alternative for Germany (AfD)... (LLM response truncated for brevity)

More details about filtering and sorting your news search can be found here.
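The tabbed examples above are also offered as Python (async) and TypeScript variants. Below is a minimal async sketch of the same flow; it assumes the async client is exported as `AsyncAskNewsSDK` (check the SDK for the exact class name) and pairs it with OpenAI's `AsyncOpenAI` client:

```python
import asyncio

from asknews_sdk import AsyncAskNewsSDK  # assumed name of the async client
from openai import AsyncOpenAI


async def main():
    ask = AsyncAskNewsSDK(
        client_id="your_client_id",
        client_secret="your_client_secret",
        scopes=["chat", "news", "stories"]
    )
    oai_client = AsyncOpenAI(api_key="")

    user = {
        "role": "user",
        "content": "What is the current political situation in Germany?"
    }

    # same call as the sync example, awaited instead of blocking
    response = await ask.news.search_news(
        query=user["content"],
        n_articles=10,
        return_type="string",
        method="nl"
    )
    news_articles = response.as_string

    system = {
        "role": "system",
        "content": "A chat between a curious user and an artificial intelligence Assistant. "
                   "The Assistant has access to the following news articles that "
                   f"may be useful for answering the User's questions: {news_articles}"
    }
    completion = await oai_client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[system, user]
    )
    print(completion.choices[0].message.content)


asyncio.run(main())
```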
# Getting the hottest topics in the news-sphere

If you would like to obtain the hottest tech stories in North America, you can use the SDK as follows:

Python (sync)

```python
def get_stories():
    response = ask.stories.search_stories(
        categories=["Technology", "Science"],
        continent="North America",
        sort_by="coverage",  # sort by coverage
        sort_type="desc",  # highest to lowest coverage
        reddit=3,  # include 3 reddit threads
        expand_updates=True,  # get all the details for updates
        max_updates=2,  # get the 2 most recent updates for each story
        max_articles=10  # include 10 news articles associated with each update
    )
    print([story.updates[0].headline for story in response.stories])

get_stories()
```

This returns the hottest tech stories in North America, sorted by coverage, with the top 3 Reddit threads and the 2 most recent updates for each story. Each story in the response is a custom AskNews-written story about a topic that is currently circulating in the news-sphere. The story includes a host of aggregated information such as sentiment, Reddit opinion, clustered articles, coverage counts, origin diversity, and much more.

More details about filtering and sorting your stories search can be found here.
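If you want more than the first headline per story, the same response object can be iterated directly. Here is a minimal sketch reusing only the accessors shown above (`response.stories`, `story.updates`, `update.headline`); the other aggregated fields mentioned earlier live on these same objects, with their exact names documented in the API reference:

```python
def show_story_updates():
    response = ask.stories.search_stories(
        categories=["Technology"],
        continent="North America",
        sort_by="coverage",
        sort_type="desc",
        expand_updates=True,
        max_updates=2,
    )
    for story in response.stories:
        # each expanded update carries its own headline; sentiment, Reddit
        # opinion, and clustered articles are attached to these same objects
        # (see the Stories reference for the exact field names)
        for update in story.updates:
            print(update.headline)

show_story_updates()
```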
# Use the News-infused Chat endpoint

If you would like to use the chat endpoint to ask questions about the news, you can use the SDK as follows:

Python (sync)

```python
def chat_query():
    response = ask.chat.get_chat_completions(
        messages=[
            {
                "role": "user",
                "content": "What is the top tech news?"
            }
        ],
        stream=False
    )

    # the response object matches the OpenAI SDK response object:
    print(response.choices[0].message.content)

chat_query()
```

Beyond news, there are other endpoints like stories, chat, analytics, and sources that you can explore.

# Additional learning material

Check out our blog post explaining how to use the AskNews SDK to infuse news into your LLM.
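The overview also lists a LangChain integration. Below is a minimal sketch of using AskNews as a LangChain retriever; the import path and the `ASKNEWS_CLIENT_ID`/`ASKNEWS_CLIENT_SECRET` environment variables are assumptions to verify against the LangChain community documentation:

```python
import os

# assumed import path for the community integration; verify in the LangChain docs
from langchain_community.retrievers import AskNewsRetriever

# assumed environment variable names read by the integration
os.environ["ASKNEWS_CLIENT_ID"] = "your_client_id"
os.environ["ASKNEWS_CLIENT_SECRET"] = "your_client_secret"

# k controls how many news documents are returned per query
retriever = AskNewsRetriever(k=3)
docs = retriever.invoke("Effect of fed policy on tech sector")

for doc in docs:
    print(doc.page_content[:200])
```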