This system automatically reads earnings call transcripts (the quarterly conference calls where executives discuss financial results with analysts) and produces a plain-English summary, a sentiment score, and the key topics under discussion. No human reading required.
Earnings calls are long — executives talk for hours about revenue, costs, growth, risks, and guidance. This system ingests the full transcript automatically.
Using a language model called BART, a transformer built for tasks like text summarization, the system condenses the key points into two to three sentences. No jargon, no fluff.
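The source doesn't show the summarization code, and running BART requires downloading a large model. As a self-contained stand-in, here is a toy extractive summarizer that illustrates the goal (condense a long transcript to a few sentences) by scoring each sentence against word frequencies across the whole text. The function name and stopword list are illustrative, not from the real system, which uses abstractive summarization via BART.

```python
import re
from collections import Counter

def summarize(text: str, max_sentences: int = 3) -> str:
    """Toy extractive summarizer: score each sentence by how often
    its words appear across the whole transcript, then return the
    top-scoring sentences in their original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    # Small illustrative stopword list; a real system would use a fuller one.
    stop = {"the", "a", "an", "and", "or", "of", "to", "in", "we",
            "our", "is", "was", "for", "on", "that", "this", "it"}
    freq = Counter(w for w in re.findall(r"[a-z']+", text.lower())
                   if w not in stop)
    # Rank sentence indices by total word-frequency score (descending).
    ranked = sorted(
        range(len(sentences)),
        key=lambda i: -sum(freq[w]
                           for w in re.findall(r"[a-z']+", sentences[i].lower())),
    )
    # Keep the best sentences, restored to reading order.
    keep = sorted(ranked[:max_sentences])
    return " ".join(sentences[i] for i in keep)

transcript = ("Revenue grew strongly this quarter. "
              "Revenue growth came from cloud revenue. "
              "The weather was nice.")
print(summarize(transcript, max_sentences=2))
```

A production pipeline would instead feed the transcript (chunked to fit the model's context window) through a pretrained BART checkpoint, which generates new sentences rather than selecting existing ones.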
Using FinBERT, an AI model trained specifically on financial text, it determines whether the overall tone of the call is positive, negative, or neutral. FinBERT picks up financial language that general-purpose models miss, like "flat revenue" being negative or "conservative guidance" being strategic.
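To make the idea concrete without downloading FinBERT itself, here is a minimal lexicon-based sketch of finance-aware sentiment scoring. The word lists and the `score_sentiment` name are invented for illustration; note how "flat", neutral in everyday English, is treated as negative, which is the kind of domain knowledge FinBERT learns from data rather than from a hand-built list.

```python
import re

# Tiny hand-built finance lexicons (illustrative only; FinBERT learns
# these associations from labeled financial text instead).
POSITIVE = {"growth", "record", "strong", "beat", "accelerating", "raised"}
NEGATIVE = {"flat", "decline", "headwinds", "miss", "weak", "lowered"}

def score_sentiment(text: str) -> str:
    """Label text positive/negative/neutral by counting lexicon hits."""
    tokens = re.findall(r"[a-z]+", text.lower())
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(score_sentiment("Revenue was flat and we faced currency headwinds."))
print(score_sentiment("Record quarter with strong, accelerating growth."))
```

The real model classifies whole sentences with context, so it can also handle negation ("not weak") and phrasing that no word list captures.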
It identifies the most important phrases discussed on the call, things like "Azure growing", "China revenue", and "operating margin", so you know exactly what the discussion focused on.
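The source doesn't say which keyphrase method the system uses, so here is one simple, common approach as a sketch: count repeated two-word phrases (bigrams) after dropping stopwords and surface the most frequent ones. The `key_phrases` helper is hypothetical.

```python
import re
from collections import Counter

def key_phrases(text: str, top_n: int = 5) -> list[str]:
    """Toy keyphrase extractor: most frequent stopword-free bigrams."""
    stop = {"the", "a", "an", "and", "or", "of", "to", "in",
            "we", "our", "is", "was", "for", "on", "that", "it"}
    tokens = [t for t in re.findall(r"[a-z]+", text.lower()) if t not in stop]
    # Count adjacent word pairs among the remaining tokens.
    bigrams = Counter(zip(tokens, tokens[1:]))
    return [" ".join(pair) for pair, _ in bigrams.most_common(top_n)]

call = ("Operating margin improved again. Operating margin expansion "
        "continues. Cloud revenue grew faster than expected.")
print(key_phrases(call, top_n=3))
```

More sophisticated extractors (TF-IDF weighting, noun-phrase chunking, or embedding-based methods) follow the same shape: candidate phrases in, ranked phrases out.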
The system analyzed earnings calls from Apple, Microsoft, Alphabet, Amazon, and NVIDIA. Here is what it found: