Innodata Inc. (INOD) has reported a significant increase in revenue for the first quarter of 2024, with a record $26.5 million, marking a 41% year-over-year growth. The company has raised its revenue guidance for the year, expecting at least 40% organic growth, driven by its success in developing AI large language models (LLMs) for big tech companies. Innodata's CEO Jack Abuhoff expressed optimism about the company's growth potential and its recent customer acquisitions. The company also highlighted its commitment to model safety and ethical AI practices, with the release of an open-source LLM Evaluation Toolkit.
Key Takeaways
- Innodata's Q1 2024 revenue reached $26.5 million, a 41% increase year-over-year.
- The company raised its 2024 revenue guidance to at least 40% organic growth.
- Success attributed to partnerships with large tech companies for AI LLMs development.
- Innodata has signed agreements with five big tech companies and added two more this quarter.
- The enterprise market is expected to surpass the big tech market in generative AI adoption.
- Innodata plans to invest in recruiting, sales, marketing, and product development.
- The company maintains a strong cash balance of $19 million and generated $5.2 million in cash from operations and receivables.
Company Outlook
- Innodata anticipates continued growth through its customer relationships in the big tech and enterprise markets.
- The company is preparing to offer a range of services to support enterprise customers.
- There are plans to release five more generative AI feature updates in the coming months.
- Innodata aims to maintain its leadership in generative AI services.
Bearish Highlights
- The company is facing the need to invest significantly in talent, with approximately $3.5 million earmarked for recruiting costs.
Bullish Highlights
- Innodata has secured engagements in the enterprise market, including a workflow system for a legal information company.
- The company has released an open-source LLM Evaluation Toolkit to support responsible AI practices.
- New customer wins are anticipated to contribute approximately $23.5 million and $20 million of annualized revenue.
Misses
- There were no specific misses mentioned in the earnings call summary provided.
Q&A Highlights
- CEO Jack Abuhoff expressed confidence in the company's multiyear growth potential.
- The company's 40% growth target for the year is considered conservative given the opportunities in their pipeline.
- Investments in compute, data science, and data engineering work are being made to support customer investments in AI models.
- The cost of goods is expected to be around 37% adjusted gross margin.
In conclusion, Innodata's first quarter of 2024 has been marked by strong financial performance and strategic growth in the generative AI market. The company's optimistic outlook and increased revenue guidance reflect the strength of its customer relationships and the expanding market demand for AI-driven solutions. With a focus on innovation, model safety, and ethical AI, Innodata is positioning itself to capitalize on the opportunities presented by the burgeoning enterprise market for AI technologies.
InvestingPro Insights
Innodata Inc. (INOD) has shown a resilient performance with a striking 41% year-over-year revenue growth in Q1 2024, as highlighted in the article. To further understand the company's financial health and market position, let's consider some key InvestingPro Data and InvestingPro Tips.
InvestingPro Data:
- The company's market capitalization stands at a solid 193.79 million USD, reflecting investor confidence in its business model and future prospects.
- With a notable revenue growth of 9.84% in the last twelve months as of Q1 2024, Innodata is demonstrating its ability to increase its earnings effectively.
- Despite not being profitable over the last twelve months, as indicated by a negative P/E ratio of -209.06, the company's gross profit margin of 37.86% suggests it is maintaining a healthy difference between the cost of goods sold and net sales.
InvestingPro Tips:
- Innodata holds more cash than debt on its balance sheet, which is a positive sign for investors looking for a company with a strong liquidity position.
- The stock has experienced significant return over the last week, with a 12.15% price total return, signaling current investor enthusiasm.
InvestingPro has additional tips that could be valuable to anyone looking to dive deeper into Innodata's financial metrics and stock performance. For instance, while the stock trades with high price volatility, it has also shown a strong return over the last five years. These insights can be particularly useful for investors who are considering the long-term potential of the company amidst short-term fluctuations.
To access these insights and more, visit https://www.investing.com/pro/INOD and remember to use coupon code PRONEWS24 to get an additional 10% off a yearly or biyearly Pro and Pro+ subscription. There are 10 additional InvestingPro Tips available for Innodata, which could help investors make more informed decisions.
Full transcript - Innodata Isogen (INOD) Q1 2024:
Operator: Greetings. Welcome to Innodata First Quarter 2024 Results Conference Call. At this time, all participants are in a listen-only mode. A question-and-answer session will follow the formal presentation. [Operator Instructions] Please note, this conference is being recorded. I will now turn the conference over to your host, Amy Agress, General Counsel at Innodata. You may begin.
Amy Agress: Thank you, Paul. Good afternoon, everyone. Thank you for joining us today. Our speakers today are Jack Abuhoff, CEO of Innodata; and Marissa Espineli, Interim CFO. Also on the call today is, Aneesh Pendharkar, Senior Vice President of Finance and Corporate Development. We’ll hear from Jack first, who will provide perspective about the business, and then Marissa will follow with a review of our results for the fourth quarter. We’ll then take your questions. Before we get started, I’d like to remind everyone that during this call, we will be making forward-looking statements, which are predictions, projections or other statements about future events. These statements are based on current expectations, assumptions and estimates, and are subject to risks and uncertainties. Actual results could differ materially from those contemplated by these forward-looking statements. Factors that could cause these results to differ materially are set forth in today’s earnings press release in the Risk Factor section of our Form 10-K, Form 10-Q and other reports and filings with the Securities and Exchange Commission. We undertake no obligation to update forward-looking information. In addition, during this call, we may discuss certain non-GAAP financial measures. In our SEC filings, which are posted on our website, you will find additional disclosures regarding these non-GAAP financial measures, including reconciliations of these measures with comparable GAAP measures. Thank you. I will now turn the call over to Jack.
Jack Abuhoff: Good afternoon. We are very excited to be here with you today. We have lots of updates to share regarding the accelerated momentum we are experiencing across our business. First and foremost, we are pleased to announce record revenues for the quarter of $26.5 million, representing 41% year-over-year growth. Our growth in the quarter was driven by the value we are bringing to help the world’s largest tech companies build AI large language models, or LLMs. As a result of accelerated business momentum, we are raising our 2024 revenue guidance to an expected organic revenue growth of at least 40% year-over-year. This is double the growth rate we guided to last quarter. We are executing a multipronged strategy to deliver – designed to deliver extraordinary levels of growth over the next several years as we extend what we believe is our early leadership in generative AI solutions. We’re focused on providing solutions at three levels of the Gen AI stack at the bottom layer, helping some of the world’s largest tech companies and independent software vendors, or ISVs, develop generative AI foundation models. In the middle layer, helping enterprises that prefer not to build models from scratch, but rather to leverage existing LLMs and other AI customized for them with their own data. And at the top layer, building generative AI-enabled platforms that are useful for niche industry requirements. Our primary focus this year is on that first layer of the stack, partnering with some of the world’s largest tech companies to develop generative AI foundation models. We are pleased with the success we are having thus far. We entered the year with agreements in place with five of the so-called Magnificent Seven companies, which are our group of well known, high performing Big Tech companies we believe will spend billions of dollars on generative AI data engineering over the next several years. We announced today that we rewarded yet another program expansion from one of our Big Tech customers. We’re valuing this expansion at approximately $23.5 million of annualized run rate revenue once implemented. This is on top of the $20 million in new programs with this customer we announced less than two weeks ago on April 24. We expect that these programs will ramp up over the next two months. While our customer agreements typically contain early termination upon notice provisions, we believe this customer is committed to a significant multiyear LLM strategy from which we stand to benefit. In fact, we are in discussions with this customer regarding potential new programs and expansions beyond what we have announced so far. We also signed two additional Big Tech companies, a large, prominent generative AI company and a large prominent consumer facing ISV, investing substantially in generative AI foundation models. As a result of these new wins, we now serve seven Big Tech customers. We believe we will continue to grow with these customer relationships in 2024, and that we may grow some of them, possibly quite substantially. For Big Tech customers, we provide a broad range of services to support their generative AI programs. This includes creating instruction datasets, which you can think of as the programming behind large language models. It also includes human preference data used in reinforcement learning and reward modeling to align models to human preferences and build guardrails against toxic biases and harmful responses. In a blog post last month that accompanied a major release, one of our large Big Tech customers stated that the quality of these instruction data sets has an outsized influence on the performance of their models, and that some of their biggest improvements in model quality come from carefully crafted instruction data sets and multiple rounds of quality assurance. This statement crystallizes why we have become the partner of choice for such customers. We believe we are well positioned to anticipate Big Tech’s changing needs and to grow with them. It is evident that Big Tech’s aspirations extend beyond today’s predominantly text-to-text English language models. We foresee expansion in terms of multimodal models, domain and task specific models, models natively built in more than 30 different languages, and models capable of complex reasoning. All of these dimensions will require modeling with the kind of data that we create. We believe we are still in the early innings of this journey. I encourage you to read the latest quarterly earnings transcripts from the Mag Seven, generative AI is a prevailing theme, with promises of more gen AI models, more gen AI and products, and commitments to multiyear investment cycles and CapEx increases to support aggressive AI research and product development. We believe the emerging enterprise market, which we call the middle layer, consisting of companies across verticals that seek to adopt generative AI technologies to be another important growth factor for Innodata, one that will ultimately dwarf the Big Tech market for us. In parallel with executing strategies to penetrate Big Tech, we’re taking steps to prepare for what we foresee as a likely explosion in the enterprise space. We believe, we are very well positioned due to our intimate knowledge of the gen AI roadmap of large tech companies, which has enabled us to gain exceptional domain expertise and the future product needs of the enterprise market. We believe that enterprise adoption is about to enter a Cambrian period of explosive growth as a result primarily of three technology developments now underway. I’ll explain and illustrate each with examples. Today, enterprise users of generative AI are mostly using ChatGPT as a standalone application. We’ll call this level-zero use case. For example, if I’m an HR Director at Innodata test with revising Innodata’s employee handbook, I can prompt ChatGPT to write a first draft of the vacation policy. Companies are now shifting from this level-zero to what we think of as level-one. We think of level-one systems as those based on Retrieval-Augmented Generation, or RAG, which we believe are likely to become better performing for reasons I will explain shortly. RAG systems couple search technology and prompt engineering with such a level-one system and Innodata employee might prompt an Innodata HR chatbot with a request like please summarize for me Innodata’s vacation policy. A search engine working behind the scenes would then retrieve Innodata’s vacation policy from a large document repository and insert it into the prompt of context, with an instruction to the LLM to answer the question, primarily based on the inserted policy. RAG-based systems are about to become more useful as the latest crop of soon to be released LLMs or for significantly expanded context windows. A context window refers to the amount of retrieved information that can be included with a prompt. By including more context, the chatbot can become more consistent, relevant, and useful. One of the Big Tech companies is about to release a new model with a context window that is eight times larger than that of OpenAI’s GPT-4 Turbo, enabling you to include, for example, 3000 pages of documents in a single prompt. Today’s expert or advanced expert augmentation systems are, for the most part RAG-based systems that combine generative AI with humans in the loop to deliver improved productivity. In a few minutes, I’ll give you an example of such a system we started working on for a customer in the quarter. We believe a second technology development called agentic workflows will enable what we’ll call level-two systems. With an agentic system, rather than asking a question to a chatbot, you present a goal to a virtual agent. Your virtual agent then accesses multiple back end systems and LLMs talk to each other to accomplish your goal. Agentic workflows really open up the kinds of things you can ask computers to do with LLMs. With an agentic system, an Innodata employee might ask a virtual Innodata agent, please look up how many days off Innodata employees get, check how many days off I have left, and request a week off around my son’s graduation, so long as there are still available hotels in Boston. Imagine that. Now, while the full realization of agentic workflows may be years away, we believe incremental progress is being achieved and will likely accelerate. The third development that we believe will accelerate enterprise adoption is that the cost of training and serving models is likely to go down dramatically, making it possible for enterprises to train and serve models at scale. Once this happens, we believe that companies are likely to want to fine tune their own models rather than relying on RAG-based architectures. We’ll call these level-three systems. Level-three systems will support more complex use cases and enable sensitive information to be processed in private clouds or on premises rather than being served up as context to third-party foundation models. We intend Innodata to offer enterprise all of the services they require to navigate the journey from level-zero to level-three and beyond. This will include custom development, integration and fine tuning services, as well as managed services around data readiness and data governance and industry specific workflow platforms. We are not alone in our thinking that the enterprise market for generative AI is about to explode. In a report last year, Bloomberg estimated the market for generative AI focused IT services will grow to nearly $22 billion by 2027 and to nearly $86 billion by 2032, representing a 100% compounded annual growth rate for the 10 year period from 2022 to 2032. To position ourselves to drive enterprise growth, we are expanding our talent base, creating new accelerators, and winning new reference engagements. This quarter, one of the largest legal information companies in the world engaged us to develop a new LLM-based workflow system for their complex operational processes spanning legal and regulatory law in multiple European countries and multiple languages. This is an example of what I referred to a few minutes ago as an advanced level-one system. Our implementation uses GPT and a combination of several techniques, including chain of density prompt engineering and a vector database with similarity matching. Our generative AI-enabled workflow system is expected to enable the customer to drive significant operational savings across high cost processes that previously relied entirely on humans with language and legal expertise. This quarter, we also delivered a generative AI powered tool that gathers on the fly insights from large-scale textual data, contextually analyzes the data for specific areas of interest, and performs language translations. The technology aims to increase organizations efficiency by ensuring knowledge workers are equipped with the intelligence needed to make informed decisions. We built it into our agility public relations platform where we call it Intelligent Insights. We made Intelligent Insights generally available to agility customers in the quarter, and it has been well received. To build the solution, we utilize RAG-based prompt engineering. We recently demonstrated it as an accelerator to one of our large banking customers and it inspired a POC that we are now executing. Agility revenue in the quarter increased 16.5% year-over-year. We have over 1,400 direct customers and we’re generating cash. We’ve been a leader in the industry, rolling out cutting edge generative AI functionality that is bending the productivity curve for PR and communications professionals. We started early last year with the release of PR CoPilot, the generative AI implementation that helps people write press releases and media pitches. In Q1, we announced the general availability of Intelligent Insights. We are planning another five significant generative AI feature releases over the course of the second half of this year and into the first quarter next year. As a result of what we believe is our generative AI leadership in the PR space. In the first quarter, we converted over 35% of demos to wins, up from less than 20% prior to implementation of our generative AI roadmap. Our customers told us that one of their biggest challenges was that they needed more hours in the day. With our generative AI innovations, we’re making tactical PR a less labor intensive process, giving our customers the time back they need for strategic thinking. For both the Big Tech market as well as the enterprise market, we see additional opportunities in model safety, evaluation and responsible and ethical AI. We began working on trust and safety for one of our Big Tech customers in Q4 2023, providing model assessment and benchmarking services, which help ensure that models meet performance, risk and emerging regulatory requirements. We learned a lot from the work and development we’ve been doing on this engagement, so we decided to share our learnings, tools and innovation with the market more broadly. Just a couple of weeks ago, we announced our release of an open-source LLM Evaluation Toolkit, together with a repository of 14 semi-synthetic and human crafted evaluation data sets that enterprises can utilize for evaluating the safety of their large language models in the context of enterprise tasks. Using the toolkit and the datasets, data scientists can automatically test the safety of underlying LLMs across multiple harm categories simultaneously, developers can understand how their AI systems respond to a variety of prompts and can identify remedial fine tuning required to align the systems to the desired outcomes. We expect to release a commercial version of the toolkit and more extensive, continually updated benchmarking datasets later this year. In Q1, we won two additional engagements for LLM safety and evaluation. One for a hyperscaler’s own foundation models and one for an enterprise customer of the hyperscaler through the white label program we have in place with the hyperscaler. In addition, in Q1 2024, we started pilots for a new customer and an existing customer around LLM Trust and Safety. I’ll conclude with this we believe we have an incredible opportunity in front of us. We believe we have the talent, capabilities and scalability to support the world’s leading company’s efforts to build AI models and services and to help enterprises advanced AI and generative AI technologies. We believe we can drive best-in-class growth over the next several years and maintain our early leadership position in generative AI services. Moreover, we believe we can accomplish this without the need to raise equity, to incur debt, or to burn cash. This year, based on our current growth forecast, we intend to invest approximately $3.5 million in recruiting costs to scale our business and approximately another $3 million in new sales, marketing and product development talent. The recruiting costs relate to the significant increase in revenues we expect this year and will not be incurred next year to support that revenue going forward. The investment in sales, marketing and product development are encouraged to continue our growth momentum and we anticipate that they will yield revenue and profitability benefits primarily next year and beyond. We anticipate approximately 70% of the recurring of the recruiting costs to be incurred in Q2 and most of the OpEx investment to be incurred in the second half of the year. We are making these investments while simultaneously driving year-over-year growth in adjusted EBITDA and building cash on our balance sheet. At the end of Q1, our cash balances were $19 million, up from $13.8 million at the end of Q4 2023, driven by positive cash flow from operations and tight working capital management. I’ll now turn the call over to Marissa to go over the numbers and then we’ll open the line for some questions.
Marissa Espineli: Thank you, Jack, and good afternoon everyone. Let me briefly share with you our 2024 first quarter financial results. Revenue was $26.5 million, up 41% from $18.8 million in the same period last year. Net income was $1 million, or $0.03 per basic and diluted share, compared to a net loss of $2.1 million, or $0.08 per basic and diluted share in the same period last year. The adjusted EBITDA was $3.8 million compared to adjusted EBITDA of $0.8 million in the same period last year. Our cash and cash equivalent and short term investments were $19 million at March 31, 2024 and $13.8 million at December 31, 2023. We currently have unused line of credit of $10 million with $9.2 million as borrowing days. So thank you everyone. Paul, we’re ready for questions.
Operator: [Operator Instructions] The first question today is coming from Arya Cole [ph] from Cole Capital. Arya, your line is live.
Unidentified Analyst: Yes, thank you. Again, congratulations on the good sales results and the good feedback you’re getting from customers who are asking, giving you additional business. Jack, one quick question. Clearly, companies can grow organically. They can also grow through acquisition. I was trying to get an update on your thought process for how important acquisitions could be as a part of your growth going forward?
Jack Abuhoff: Arya, thank you for the question. So the business plan that we’re currently executing successfully is an organic only plan. We’ve got a strategy in place that we think, we can successfully deliver through organic growth and through the resources that are available to us and that we’re able to augment our team with, from time to time, we will kick the tires on potential inorganic opportunities. But I think the important thing is that we’re not dependent upon that for growth. We recognize that those come with risks that are not necessarily risks that one incurs with a or get primarily organic strategy. And that’s the way we’re executing right now.
Unidentified Analyst: Understood. Then the follow up is looking at organic growth. When you look at the financial employees, what sort of sales per employee can you generate when you’re primarily providing services for these large technology companies? And what sort of – what is the gross margin range you think you can achieve when people’s time is being optimally utilized?
Jack Abuhoff: Sure. So, we target – at a consolidated level, we target about 40% adjusted gross margin. When you back out the one-time recruiting charges that relate to the revenue that we’re scaling up for. And if you back out sales and marketing investments that we plan to have to be returning on that investment in the following year. From a revenue per employee, we don’t track that although it is notable, and it’s not lost on us but, our revenue employee has gone or with the large language model work that we’re doing is probably four to even five times in excess of what it’s been historically with the business process management services.
Unidentified Analyst: Got it. And just finally, in terms of employee retention, you’re going to be in a more competitive marketplace for employees in the future. And I’m just wondering what you’re putting in place to try and make Innodata a place that people really enjoy working at, where your churn hopefully, our turnover employees will be kept to a minimal level?
Jack Abuhoff: So for the most part, over the years, churn has not presented a problem for us that we weren’t able to manage. We were able to keep it at a level that was way below that of most of our competitors. Just this past quarter, we were elected as one of the best places to work by, in two different of our locations. So, we believe that we will be able to manage it. And we’re very excited about the – obviously very excited about the growth opportunity in front of us.
Unidentified Analyst: Okay, great. Well, thank you and best of luck.
Jack Abuhoff: Thank you.
Operator: Thank you. The next question is coming from Tim Clarkson from Van Clemens. Tim, your line is live.
Tim Clarkson: Hey, Jack. Obviously a great quarter. Just want to go over some of the basics for some of us luddites here. So when I originally realized that Innodata was getting involved in artificial intelligence, the one thing that was really helpful for me understanding why you guys are successful is just the accuracy. And the top guy from IBM (NYSE:IBM) explained it to me that at IBM, they had a 75% failure rate, largely because of inaccuracies. And what I’m hearing is you guys make, like, three mistakes and 20,000 annotations. Can you just talk about how that’s really, it seems simple, but it’s a real foundational skill. You’re doing more than that, obviously, but that really is what initially separates you from the competition in a big way.
Jack Abuhoff: Tim, thank you for the question. So, absolutely. I think data quality, has been proven to lead to great model performance. And we have customers that I mentioned a few minutes ago who, with their new releases, are attributing their incremental improvements to their data quality. And we’re working for those customers. So, we’re thrilled that, we’re producing those great outcomes for our customers, and we’re seeing that those great outcomes are leading to, expansions and growth. In terms of LLMs, there are three critical ingredients. There’s compute, there’s data science, and then there’s data engineering work that we do. Given the investments that these companies are making and the kind of risk that they’re taking and the strategic emphasis that they’re placing on these developments, its critical to them to have the kind of data that will result in the models that they, and the accuracy of the models that they’re building. So, we think we’re very well positioned. We’re doing great work, and we think the future is very bright.
Tim Clarkson: Right. Now, just as kind of an extension to that idea. I mean, one of the challenges, obviously, what Innodata historically was, you do a project and then it ended, and there wasn’t a durability of revenues. Can you comment on how durable you think this new business model is going to be into the future?
Jack Abuhoff: So, I think it’s extremely early days still. We see lots of developments that are taking place. I mentioned several of those. We believe that we’re still in the early innings of executing all of these opportunities. We’re starting to now talk to our customers about multimodal models, about multiple languages, more complex reasoning tasks that they want the models to be able to take on, domain and task specificity. All of these things are early for them, and we think we’re early in our partnerships with them such that we’re going to be able to drive multiyear growth as a result of their multiyear investments. At the same time, we’re landing more companies and we’re getting into the enterprise opportunity. So, we think there’s a tremendous amount of growth opportunity in front of us.
Tim Clarkson: Right. Now historically, back in the, I don’t know if you call it the good old days, but the old days when you guys were picking on projects, you’re doing $20 million [ph], I remember netting 15%, 20% after tax, and it seems like you got these one-time expenses with sales investments, but separating those out, it seems like there’s certainly a potential to have 15% to 20% pretax margins once you guys get onto a solid run rate.
Jack Abuhoff: So, yes, I think that’s certainly what we’re pursuing. We think the opportunity is there. We also think it’s important, as I mentioned, to be investing in our growth. We believe that we have an opportunity in front of us to create a truly great company, and we’re very excited about that.
Tim Clarkson: Now, these investments would be directly tied into AI then right, on that part of the business?
Jack Abuhoff: Exactly right.
Tim Clarkson: Right. Great. Well, good luck. It’s an exciting future. Thank you.
Jack Abuhoff: Thank you, Tim.
Operator: Thank you. The next question will be from Dana Buska from Feltl. Dana, your line is live.
Dana Buska: Hi, Jack.
Jack Abuhoff: Hey, Dana.
Dana Buska: Congratulations on a wonderful quarter. I have just a couple questions for you. You talked a little bit about the demand for training on enterprise data. Can you talk a little bit more about that? And are you seeing more enterprises look to train their own models?
Jack Abuhoff: I think we’re going to see more enterprises training their own models when the cost of doing so comes down, which we believe is inevitable. For the most part, what enterprises are looking to do now is to use their own data as context within RAG implementations. And even with that, we see a lot of opportunities that are now being piloted and POCs very early days in terms of getting things into development or past development into implementation. We think we’re going to be there in order to help accomplish that. Again, quality of data that feeds those models and the kinds of integrations that become possible, especially with large context windows, is going to improve the results of those models and what can be achieved with them. We’re very well positioned to help enterprises move things from POCs into development, and we’re doing that now, today.
Dana Buska: Okay, great. And along those lines, I seem to read or come across articles about how companies are coming up with tools and software to speed up fine-tuning and doing trust and safety, could you talk a little bit about how that may affect your business? And do you consider that to be a threat?
Jack Abuhoff: No, we consider that to be an opportunity. The more success we have and the world has it, moving things from POC into development, the more acceleration we will see in the technology and in the opportunities that we have. We are among the companies, who is developing capabilities around trust and safety. We are very excited about that. Again, it is a data-driven initiative and a data-driven approach that we are taking that we believe will help companies along in that path.
Dana Buska: Okay, excellent. That sounds exciting. And when you’re looking at hiring these new recruiting costs, could you talk a little bit about hiring the new people? Could you talk a little bit about what type of positions you’re looking to hire?
Jack Abuhoff: So we’re hiring very broadly. We’re hiring people who have the ability to help us build the kinds of data sets that we require to train these models. A lot of the people that we’re hiring are language experts, or they’re domain experts, they’re linguists, they’re people with backgrounds that enable them to appreciate nuance and language. As we think about these models, it’s language that is used to program them. And the more fine-tuning we can do around the nuance of those language, the better performing the models are.
Dana Buska: Okay, great. And then one question about agility. It looked like it grew about 20% last quarter. Is that a growth rate that you are targeting for this year, or is that growth rate that you think is sustainable?
Jack Abuhoff: So, I think it was about 15%, if I’m not mistaken…
Aneesh Pendharkar: Dana was 17%.
Jack Abuhoff: 17%, yes. So, I think 15% to 20% is certainly what we would aspire to. We could do a lot more with more reinvestment in it, and that’s something that we would probably be considering at some point. But right now, we’re very focused on capital allocation relative to the service opportunities we have with large tech companies.
Dana Buska: Okay, excellent. It sounds very exciting, and thank you for taking my questions.
Jack Abuhoff: Thank you.
Operator: Thank you. The next question is coming from Bruce Galloway from Galloway Capital. Bruce, your line is live.
Bruce Galloway: Hey, Jack, congratulations. It seems like you’re gaining a lot of traction. Just to quantify the size of these contracts and some of the new additions. You mentioned the $23.5 million add on to the $20 million hyperscaler that you announced in April. I guess that was the one that was doing $23 million a year, that extended it for three years. So that alone is a $43.5 million add-on from the $23 million, which on a base of $86 million last year, that’s over 50% growth rate just from one customer. Can you give us some quantification on some of these other contracts, how big they could be and how much they can scale to?
Jack Abuhoff: Sure. So, as I mentioned, we’re now working with seven Big Tech companies, and we believe when we look across that portfolio, that they are all spending significantly, as significantly as the company that you referred to on these technologies. And we believe we have an opportunity to scale with all of them. Now, will all of them become as big as this one? That would be wonderful if it were to happen. I don’t know that it will, but we believe we’re certainly very well positioned. We brought in another two customers this quarter, which are very significant players and appear to be very eager to work with us. And in terms of when you start adding up the numbers and adding up what’s possible, we believe that the 40% growth target, which doubles what we provided last quarter, is still a reasonably conservative target. We have lots of opportunity in flight. We’re taking conservative view of our pipeline in order to support our growth guidance, and it’s a very exciting time in our business right now.
Bruce Galloway: So, looking at your current infrastructure, your personnel and your installed base, I mean, do you have the capability? I mean, adding $3 million in personnel and $3 million in CapEx seems a very modest amount for the scalability and the upside that you have with all these huge contracts coming through.
Jack Abuhoff: So the investments that I spoke about were investments in the recruiting costs associated with adding the headcount that is required for the near term contracts to get to that 40% growth and other investments that we’re making in sales, marketing and product development. I’m not including in those numbers the costs that we would be incurring to as cost of goods. But again, from a cost of goods perspective, I think you should look at it as an opportunity to drive in the services area, probably about 37% adjusted gross margin. And that would subsume the costs of producing the revenue.
Bruce Galloway: Okay. And also, as far as the cash generation, you generated, like $5.2 million in cash. Is that mostly from receivables? Or is it pure cash from operations or combination thereof?
Jack Abuhoff: Yes, I mean, it’s cash from operations that include receivables. So as we build, we then collect.
Bruce Galloway: Okay, great, thanks. Excellent job. Thanks.
Jack Abuhoff: Thank you.
Operator: Thank you. And that does conclude today’s Q&A session. I will now hand the call back to Jack Abuhoff for closing remarks.
Jack Abuhoff: Thank you. So, yes, we’re off to an exciting start for 2024, and my team and I are tremendously energized by what we believe we will accomplish in 2024. We grew revenue by 41% year-over-year. In Q1, we anticipate a substantial sequential revenue increase next quarter, and for 2024 overall, we have raised our guidance from 20% to at least 40% year-over-year expected organic only revenue growth. Today, we announced yet another substantial customer win, this one worth approximately $23.5 million of annualized revenue. This is on top of the win with the same customer we announced a couple of weeks ago, valued at approximately $20 million of annualized revenue. We also announced the signing of two additional Big Tech customers, both of which we believe will contribute to 2024 revenue growth longer term. In addition to expected or continued expected growth from our Big Tech customer base, we expect increased market demand from enterprise customers that we hope will continue to accelerate our growth trajectory. We’re super excited about the work that’s underway. We are laser focused on creating long term shareholder value, and thank you all for attending today’s call. We’ll be looking forward to our next call.
Operator: Thank you. This does conclude today’s conference. You may disconnect your lines at this time. Thank you for your participation.
This article was generated with the support of AI and reviewed by an editor. For more information see our T&C.