AI can add real value to a business, but when it comes to predicting the unpredictable, it can also destroy value if its risks are not properly disclosed to investors.
In one recent example, the real estate company Zillow experienced a rough Q4 2021. The company now faces two class action shareholder lawsuits in federal court over claims that it misrepresented the profitability of its home-flipping business, Zillow Offers. Zillow’s predictive algorithms proved no match for a volatile market, and the company’s predictive promises also ran into supply and labor hurdles.
Zillow is accused of violating the Securities Exchange Act of 1934 by making “materially misleading” statements to investors. Zillow’s iBuying arm, Zillow Offers, gave sellers a way to flip their homes without ever putting them on the market: Zillow Offers bought the homes for cash.
The complaint alleges that Zillow made false and/or misleading statements and failed to disclose that, among other things, it experienced significant unpredictability in forecasting home prices for Zillow Offers. That unpredictability, combined with labor and supply shortages, led to a backlog of inventory and made it reasonably likely that the company would shutter Zillow Offers, with a material adverse impact on its financial results. When the news broke, Zillow’s share price fell, harming investors.
Is Zillow’s fate becoming more common for companies that rely on algorithms for their business model? Predictive algorithms and algorithmic trading have recently drawn scrutiny from regulators, and more and more companies, from small brokers to larger investment firms, are running afoul of the erratic nature of algorithms.
In another example, toward the end of 2020 the SEC settled with BlueCrest for $170 million. The SEC found that BlueCrest violated the negligence-based antifraud provisions of the Securities Act of 1933 and the Investment Advisers Act of 1940 when it failed to adequately disclose an algorithm that was trading without its investors’ knowledge. One conclusion that can be drawn from the SEC’s investigation of BlueCrest is that the SEC expects AI to be covered in disclosures: BlueCrest’s references to “quantitative strategies” were deemed inadequate to inform investors of the risk posed by nonhuman, AI-based trading.
A September 2021 report from IOSCO (the International Organization of Securities Commissions) outlined measures regulators might take to ensure market intermediaries and asset managers use predictive and other forms of AI responsibly. The report recommends that regulators require companies using AI to undertake the following:
- Regular testing and monitoring of AI and ML techniques
- Designated senior management to be responsible for the oversight of ML and AI, including testing and monitoring
- Firms to have the knowledge and expertise to monitor and oversee AI and ML and conduct due diligence to ensure third parties have the expertise
- Firms to have clear contracts and agreements with third parties to monitor and oversee performance of AI and ML
- Firms to disclose the use of AI and ML to clients and to outline the risks
- Firms to have access to adequate data such that biases don’t emerge in the algorithms.
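To make the first recommendation concrete, "regular testing and monitoring" of a predictive model often means tracking its recent prediction error against a historical baseline and flagging drift for review. The sketch below illustrates one minimal form such a check could take; the function names, the dollar figures, and the 25% tolerance are all hypothetical, not drawn from any regulator's guidance.

```python
# Illustrative sketch of a model-monitoring check: flag when a predictive
# model's recent error drifts beyond a tolerance of its historical baseline.
# All names, numbers, and thresholds here are hypothetical examples.

def mean_absolute_error(predictions, actuals):
    """Average absolute gap between predicted and realized values."""
    return sum(abs(p - a) for p, a in zip(predictions, actuals)) / len(predictions)

def drift_alert(baseline_mae, predictions, actuals, tolerance=0.25):
    """Return True when recent error exceeds the baseline by more than
    `tolerance` (0.25 = 25%), signalling the model needs human review."""
    recent_mae = mean_absolute_error(predictions, actuals)
    return recent_mae > baseline_mae * (1 + tolerance)

# Hypothetical home-price model whose recent errors have widened:
baseline = 12_000.0  # historical average error in dollars (illustrative)
predicted = [310_000, 450_000, 275_000]
realized = [290_000, 410_000, 260_000]

print(drift_alert(baseline, predicted, realized))  # True: error widened past 25%
```

In practice a firm would log each alert, route it to the designated senior manager the IOSCO report describes, and retain the records as evidence of oversight.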
Any company that uses predictive algorithms in its business model, whether to manage funds, to identify investment opportunities, or for any other purpose that affects its clients or investors, needs to be mindful of the approach regulators are taking to these tools. Companies should consult experienced business litigators to ensure that any AI-based strategies will stand up to scrutiny.
A responsible attorney with knowledge of how technology impacts legal obligations can help companies manage these risks from the outset. Data security attorneys and business litigation attorneys are working at the forefront of these new obligations. Algorithmic strategies can be implemented with great success, but they must be implemented with care and caution.
The Zillow example reveals the profound impact AI can have on a business. BlueCrest faced accusations of not revealing its use of AI; Zillow is accused of failing to disclose how its overall strategy, which prominently featured predictive AI, could harm investors. In both cases, failure to monitor the performance of AI and to disclose its risks as part of the business strategy exposed the companies to liability.
This isn’t a problem that affects blue-chip companies alone. Fund managers and advisers with a fiduciary duty need to be particularly mindful of any way AI intersects with their business, whether through third-party algorithms or algorithms developed in-house.