As artificial intelligence becomes more embedded in the property sector, PropTech provider Reapit is advising estate and lettings agencies to be cautious when using or offering services based on so-called ‘black box’ AI: systems that make decisions without clear or explainable logic.
Reapit says it is committed to using ‘explainable’ AI in its own platforms and has raised concerns about the growing reliance on opaque AI systems, particularly those based on deep learning models. These systems can be difficult, or even impossible, to fully understand or audit, a problem with implications for legal compliance and customer trust.
IBM and other technology experts have highlighted that some AI models are so complex that their decision-making processes cannot be easily interpreted, even by their creators. In practical terms, this means firms using such systems may not be able to justify how a particular decision was made, a growing issue as AI tools are used for tasks such as automated valuations, property listings, and tenant communication.
This lack of transparency carries legal and operational risks. Under the UK’s Data (Use and Access) Act 2025 and existing consumer protection regulations, firms must inform individuals when AI is used to make significant decisions. Legal guidance from Freshfields notes that people have the right to receive an explanation, request human review, and challenge decisions, particularly in sensitive areas such as tenancy applications or service complaints.
In addition, the Digital Markets, Competition and Consumers Act 2024 places stricter controls on marketing practices. If AI tools produce misleading property descriptions, altered images, or inaccurate valuations, and this is not properly disclosed or reviewed, agencies could face penalties.
Reapit’s warning comes at a time when the property industry is rapidly adopting AI-driven tools. Agencies are encouraged to ensure that any AI used in their operations is transparent, auditable, and compliant with current law.
A recent case of AI-enhanced property photos prompted Sam Richardson, deputy editor at consumer magazine Which?, to say: “Finding the right home to buy or rent can be challenging enough without having to worry about AI or edited images. With home buyers and renters potentially needing to view multiple properties, this could waste their time and money travelling to viewings of properties that look nothing like they do online.”
Despite these high-profile cases, McKinsey forecasts that AI adoption in real estate will grow by over 40% globally by 2026, with PropTech investment expected to exceed €10bn (£8.7bn) annually. The firm says success will depend on using proprietary data, securing executive buy-in, and establishing safeguards against bias and hallucination.
“The question isn’t whether agents will use AI, it’s whether they’ll use the right AI,” said Matt McGown, chief product officer at Reapit. “Generic tools might save time, but they can also introduce risk. If your AI can’t show how it reached a decision, how much it edited a photo, what information it used to draft a property description, or why it approved a tenant or prospective buyer for a viewing, you’re risking fines and your hard-won reputation.”
According to a Reapit survey of 624 UK property professionals in July 2025:
+ 62% believe AI can make decisions and learn independently
+ 29% see AI primarily as automation
+ 79% have encountered tools marketed as AI that were in fact basic scripting