<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Technology &amp; Innovation Archives - Werksmans Attorneys</title>
	<atom:link href="https://werksmans.com/tag/technology-innovation/feed/" rel="self" type="application/rss+xml" />
	<link>https://werksmans.com/tag/technology-innovation/</link>
	<description>Corporate and Commercial Law Firm</description>
	<lastBuildDate>Fri, 27 Mar 2026 10:18:25 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	

<image>
	<url>https://werksmans.com/wp-content/uploads/2025/04/cropped-WERKSMANS-W-scaled-1-32x32.bmp</url>
	<title>Technology &amp; Innovation Archives - Werksmans Attorneys</title>
	<link>https://werksmans.com/tag/technology-innovation/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Global AI Governance Frameworks in a Diverging World</title>
		<link>https://werksmans.com/global-ai-governance-frameworks-in-a-diverging-world/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=global-ai-governance-frameworks-in-a-diverging-world</link>
					<comments>https://werksmans.com/global-ai-governance-frameworks-in-a-diverging-world/#respond</comments>
		
		<dc:creator><![CDATA[Ahmore Burger-Smidt]]></dc:creator>
		<pubDate>Fri, 27 Mar 2026 10:04:21 +0000</pubDate>
				<category><![CDATA[Legal updates and opinions]]></category>
		<category><![CDATA[Data Privacy]]></category>
		<category><![CDATA[Technology & Innovation]]></category>
		<guid isPermaLink="false">https://werksmans.com/?p=25434</guid>

					<description><![CDATA[<p>by Ahmore Burger-Smidt, Director and Head of Regulatory “The biggest lesson learned is we have to take the unintended consequences of any new technology along with all the benefits, and think about them simultaneously – as opposed to waiting for the unintended consequences to show up and then address them,”  Satya Nadella Microsoft CEO 2024 Artificial intelligence  [...]</p>
<p>The post <a href="https://werksmans.com/global-ai-governance-frameworks-in-a-diverging-world/">Global AI Governance Frameworks in a Diverging World</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><em>by Ahmore Burger-Smidt, Director and Head of Regulatory</em></p>
<p style="text-align: center;"><em>“The biggest lesson learned is we have to take the unintended consequences of any new technology along with all the benefits, and think about them simultaneously – as opposed to waiting for the unintended consequences to show up and then address them,” </em></p>
<p style="text-align: center;"><strong><em>Satya Nadella, Microsoft CEO, 2024</em></strong></p>
<p>Artificial intelligence governance has moved from theory to the board agenda. Organisations building or deploying AI across borders now face a mix of voluntary guidance and hard law. The practical question is no longer whether to adopt a framework, but which combination will withstand regulatory scrutiny, match the organisation’s risk profile, and work in day‑to‑day operations. There is no single winner.</p>
<p>When weighing voluntary guidance against hard law, organisations aiming to adopt an AI framework ought to take a practical approach. For most, the defensible answer is layered: comply where the law is strict, and use one or more voluntary frameworks to structure governance, evidence good practice, and adapt as the landscape shifts.</p>
<p>The current touchstones are well known. The OECD AI Principles, refreshed in 2024, provide high-level, government-endorsed norms that cross borders. NIST’s AI Risk Management Framework offers operational scaffolding that integrates with enterprise risk programmes. IEEE’s Ethically Aligned Design provides engineers with granular guidance. Singapore’s Model AI Governance Framework is praised for its practical, proportionate implementation and sector-specific playbooks. ISO/IEC 42001 introduces the first certifiable AI management‑system standard. The G7 Hiroshima Process Code of Conduct points frontier developers towards safety testing and transparency.</p>
<p>Alongside these sits the outlier in legal effect: the EU AI Act, now in force with staged obligations through 2027 and backed by serious penalties.</p>
<p>Despite different origins, these instruments all speak to the same five principles: fairness, accountability, transparency, human oversight and safety.</p>
<p>Where they deviate is depth and enforceability. The OECD sets the tone, articulating non‑discrimination, respect for rights, transparency and robustness as shared values, while deliberately stopping short of prescribing how to implement them. NIST translates principles into processes through govern, map, measure, and manage functions, and tackles bias, explainability, and human judgement with concrete practices. IEEE dives into the technical detail, from dataset audits to fail‑safe design patterns. Singapore keeps the focus on outcomes, insisting on context-appropriate metrics, proportionate explanations, and the right-sized human involvement. ISO/IEC 42001 turns governance into an auditable discipline, requiring documented roles, risk treatment, oversight mechanisms and continual improvement. The G7 Code sets expectations for advanced models: pre-deployment testing, red-teaming, transparency reporting, and post-market monitoring.</p>
<p>Two frameworks, however, play a distinctive role in keeping governance human‑centred and proportionate. The OECD Principles begin with people, not systems. By anchoring AI to human rights, democratic values and the rule of law, they make human agency and dignity the standard for design choices, deployment contexts and routes to redress. They call for inclusive growth and non-discrimination, pushing organisations to ask who benefits, who is burdened, and whether affected communities can understand, contest, and influence AI-enabled decisions. Their take on transparency and explainability is purposeful: disclosure should be meaningful to users and those impacted, not a tick‑box. Because the OECD speaks in norms rather than checklists, it invites stakeholder engagement and reasoned judgement, keeping AI grounded in lived experience and social outcomes as technology evolves.</p>
<p>Singapore’s Model AI Governance Framework operationalises the same ethos through the principle of proportionality. It assumes that risk is contextual and that fairness, transparency and oversight must be calibrated to the impact of a given use case. It promotes explanations that are meaningful to their audience rather than generic templates, and it links the degree of human‑involvement in the loop to the stakes of the decision. Its sector guides, notably in financial services and healthcare, translate principles into practical steps that fit real operational environments. By encouraging continuous monitoring, targeted testing, and structured user feedback, it steers teams towards the right-sized controls that protect individuals while leaving room for innovation. For organisations at different stages of maturity, this approach avoids gold‑plating and reduces the risk that governance becomes paperwork detached from outcomes.</p>
<p>The EU AI Act differs from every other instrument here in both scope and enforceability. It is a binding law with extra‑territorial reach, applying to providers, deployers, importers and distributors that place AI systems on the EU market or whose systems affect people in the EU. It classifies AI uses by risk, prohibits certain practices outright, and imposes detailed, legally enforceable obligations on high-risk systems, including risk management, data governance, technical documentation, logging, human oversight, post-market monitoring, and incident reporting. It brings general-purpose AI into scope, layering transparency requirements and additional measures when models present systemic risks. Compliance is policed through conformity assessment, with meaningful fines for breaches. Much of the practical detail will be elaborated through harmonised standards and secondary measures over the next two years, but the direction is fixed: unlike voluntary frameworks, the Act creates duties, assigns liability and sets penalties. By contrast, the OECD Principles, NIST, IEEE, Singapore’s framework and the G7 Code are non‑binding; they shape expectations and practice but do not carry legal sanctions. Even ISO/IEC 42001, while certifiable and powerful in procurement and assurance, is not law and does not create any statutory defences on its own.</p>
<p>Choosing among frameworks, therefore, becomes a question of balance and fit. The OECD Principles provide legitimacy, a common language for boards and stakeholders, and a north star that keeps programmes human‑centred. Singapore supplies the day‑to‑day discipline of proportionality, helping teams define the use case, assess the risks, calibrate the controls, explain decisions in ways people can act on, and adjust as evidence accumulates. NIST offers the most detailed operational practices to make those choices repeatable across the lifecycle. ISO/IEC 42001 turns them into a verifiable management system that regulators, customers and investors can trust. IEEE hardens the engineering spine. The EU AI Act sets the hard floor where it applies; its obligations should be built in from the outset, not layered on at the end.</p>
<p>For cross‑border organisations, interoperability is the key to defensibility. An ISO-style management system can serve as the backbone, integrating NIST processes and Singapore’s proportional controls while remaining anchored to the OECD’s human-centred norms and mapped to the EU AI Act where relevant.</p>
<p>What does this mean for organisations adopting AI tools, engaging in big data analytics and creating value? The destination is not perfection but defensibility. In a world of regulatory divergence, the strongest posture is a coherent, documented and adaptable governance architecture that shows your work: why you chose the frameworks you did, how they map to your risks and markets, and how you are improving over time.</p>
<p>Principle and prescription are not alternatives; they are the twin rails that keep AI governance both human‑centred and proportionate, and, where the EU AI Act applies, lawful.</p>
<p>Be proactive. Identify unintended consequences and reap the benefits of AI adoption.</p>
<p>The post <a href="https://werksmans.com/global-ai-governance-frameworks-in-a-diverging-world/">Global AI Governance Frameworks in a Diverging World</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://werksmans.com/global-ai-governance-frameworks-in-a-diverging-world/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Cracking Down or Catching Up? South Africa’s Approach to Crypto Regulation: Part 3 &#8211; Exchange Control</title>
		<link>https://werksmans.com/cracking-down-or-catching-up-south-africas-approach-to-crypto-regulation-part-3-exchange-control/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=cracking-down-or-catching-up-south-africas-approach-to-crypto-regulation-part-3-exchange-control</link>
		
		<dc:creator><![CDATA[Armand Swart]]></dc:creator>
		<pubDate>Tue, 28 Oct 2025 12:23:13 +0000</pubDate>
				<category><![CDATA[Legal updates and opinions]]></category>
		<category><![CDATA[Technology & Innovation]]></category>
		<guid isPermaLink="false">https://werksmans.com/?p=24488</guid>

					<description><![CDATA[<p>by Armand Swart - Director - Deon Griessel, Hilah Laskov - Director and Hlonelwa Lutuli - Associate  Introduction Crypto assets ("crypto") exist in a unique regulatory space. Unlike traditional currency, crypto is not issued by central banks. Crypto can however be used in similar ways to traditional currency: it can be traded, used for payments,  [...]</p>
<p>The post <a href="https://werksmans.com/cracking-down-or-catching-up-south-africas-approach-to-crypto-regulation-part-3-exchange-control/">Cracking Down or Catching Up? South Africa’s Approach to Crypto Regulation: Part 3 &#8211; Exchange Control</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>by <em>Armand Swart &#8211; Director, Deon Griessel &#8211; Director, Hilah Laskov &#8211; Director and Hlonelwa Lutuli &#8211; Associate</em></p>
<p><strong>Introduction </strong></p>
<p>Crypto assets (&#8220;<strong>crypto</strong>&#8220;) exist in a unique regulatory space. Unlike traditional currency, crypto is not issued by central banks. Crypto can however be used in similar ways to traditional currency: it can be traded, used for payments, investing, security, or capital raising.</p>
<p>No single law governs crypto in South Africa. Instead, regulation is fragmented across various laws, demanding that organisations understand this legal landscape.</p>
<p>This article is the third in a series on crypto regulation. The first article, available <a href="https://werksmans.com/cracking-down-or-catching-up-south-africas-approach-to-crypto-regulation-part-1-payments/">here</a>, delved into the regulation of crypto as a form of payment. The second article, available <a href="https://werksmans.com/cracking-down-or-catching-up-south-africas-approach-to-crypto-regulation-part-2-financial-services-and-fica/">here</a>, considered the impact of financial services laws as well as anti-money laundering and counter terrorism laws on crypto regulation. In this article, we map exchange control considerations for crypto asset services providers (&#8220;<strong>CASPs</strong>&#8220;).</p>
<p><strong>How does crypto fit into the Exchange Control Regulations?</strong></p>
<p>The Exchange Control Regulations, 1961 (the &#8220;<strong>Regulations</strong>&#8221; and each a &#8220;<strong>Regulation</strong>&#8220;) are, amongst other things, concerned with restricting the transfer of funds and financial capital assets held in South Africa out of South Africa. More specifically in this context, (i) Regulation 3(1)(c) provides that a South African resident may not pay a non-resident without exchange control approval (the &#8220;<strong>Currency Payment Rule</strong>&#8220;); and (ii) Regulation 10(1)(c) prohibits the export of capital without exchange control approval (the &#8220;<strong>Capital Export Rule</strong>&#8220;).</p>
<p>Crypto can readily be transferred from a digital wallet on a South African crypto exchange to a foreign crypto exchange (a &#8220;<strong>Cross-Border Crypto Transfer</strong>&#8220;). This poses the question: is the crypto so transferred &#8220;<em>capital</em>&#8221; and therefore subject to the Capital Export Rule, and / or is it &#8220;<em>currency</em>&#8221; and therefore subject to the Currency Payment Rule? If the answer to either is yes, a Cross-Border Crypto Transfer will require exchange control approval.</p>
<p><strong>Latest developments in crypto exchange control treatment</strong></p>
<p>The aforementioned question was dealt with in the decision of the High Court in <em>The Standard Bank of South Africa v The South African Reserve Bank (SARB)</em> <em>&amp; Others</em><em> (047643/2023) [2025] ZAGPPHC 481 (15 May 2025) </em>(the &#8220;<strong><em>SBSA </em>decision</strong>&#8220;). The case concerned the validity of a forfeiture order issued by the South African Reserve Bank (&#8220;<strong>SARB</strong>&#8220;) in respect of funds held by the applicant, The Standard Bank of South Africa (&#8220;<strong>SBSA</strong>&#8220;), and the sixth respondent, Nedbank Limited, but belonging to a company called Leo Cash and Carry Proprietary Limited (&#8220;<strong>LCC</strong>&#8220;). The SARB issued the forfeiture order after LCC had transferred 4,405.9783 Bitcoin to the value of R556 million out of South Africa to a Seychelles-based crypto exchange, which constituted a Cross-Border Crypto Transfer.</p>
<p>The central question in the matter before the court was whether the Cross-Border Crypto Transfer contravened the Currency Payment Rule and / or the Capital Export Rule, i.e. was LCC required to obtain exchange control approval for the transfer?</p>
<p><strong>Does the Currency Payment Rule apply to a Cross-Border Crypto Transfer? </strong></p>
<p>The court rejected the SARB&#8217;s argument that crypto was &#8220;<em>currency</em>&#8221; for purposes of the Currency Payment Rule, deeming such a construction &#8220;<em>strained and impractical</em>&#8220;. The court recognised that crypto was not legal tender in South Africa and that considering crypto to be currency presented practical challenges, such as determining whether it must be declared when entering or leaving South Africa. The court noted the global nature of crypto as &#8220;<em>codes on a digital ledger</em>&#8221; that exist anywhere and everywhere. This is different from the types of currency that are covered by the Currency Payment Rule, like securities and bank notes.</p>
<p>Given the punitive nature of the Regulations and its view that crypto is not &#8220;<em>currency</em>&#8220;, the court found &#8220;<em>no room for an unnatural and fictitious</em>&#8221; interpretation namely that the Currency Payment Rule should apply to a Cross-Border Crypto Transfer.</p>
<p><strong>Does the Capital Export Rule apply to a Cross-Border Crypto Transfer? </strong></p>
<p>The court held that on any interpretation of the Regulations &#8211; much less a restrictive one &#8211; there is no room for regarding crypto to be &#8220;<em>capital</em>&#8221; for purposes of the Capital Export Rule. The court agreed with SBSA that a regulatory framework addressing crypto is long overdue:  crypto has been in existence for over 15 years and the SARB has to date taken no steps to regulate it. Nevertheless, the court could not usurp the functions of the legislature in that regard.</p>
<p>In its interpretation of the Capital Export Rule, the court referred to the decision of the Supreme Court of Appeal in <em>Oilwell (Pty) Ltd v Protec International Ltd and Others</em> 2011 (4) SA 394 (SCA) (&#8220;<strong><em>Oilwell</em></strong>&#8220;). In <em>Oilwell</em>, the court applied a restrictive interpretation to the Capital Export Rule, finding that intellectual property (&#8220;<strong>IP</strong>&#8220;) rights were not &#8220;<em>capital</em>&#8220;, and exchange control approval was therefore not required for their transfer out of South Africa. Following the <em>Oilwell</em> decision, the exchange control authorities swiftly responded by amending the Capital Export Rule so that the transfer of IP rights now explicitly falls within the ambit of the Capital Export Rule.</p>
<p><strong>The impact of the <em>SBSA </em>decision</strong></p>
<p>The court concluded that LCC, in making the Cross-Border Crypto Transfer/s, contravened neither the Currency Payment Rule nor the Capital Export Rule, and it set aside the SARB&#8217;s forfeiture order.</p>
<p>We were informed by the SARB that it has been granted leave to appeal the <em>SBSA </em>decision directly to the SCA, suspending the High Court decision pending the outcome of the appeal. It therefore remains to be seen whether the decision and reasoning of the High Court will be upheld or struck down. As of the date of this article, the date of the SCA appeal has yet to be set.</p>
<p>The <em>SBSA</em> decision highlighted a major gap in the Regulations as regards crypto. It is therefore possible that the Regulations will be amended such that the Currency Payment Rule and / or the Capital Export Rule apply to a Cross-Border Crypto Transfer, similar to what happened regarding IP rights following the <em>Oilwell</em> decision.</p>
<p><strong>Bringing clarity to crypto: final thoughts </strong></p>
<p>South Africa has taken significant steps to regulate crypto by bringing CASPs under both AML (FICA) and financial services (FAIS) frameworks, establishing compliance requirements for CASPs. Crypto is not recognised as legal tender in South Africa, although more businesses are beginning to accept crypto as a form of payment. Despite this, the most significant <em>lacuna</em> in the crypto regulatory landscape relates to exchange control considerations, as highlighted in this article.</p>
<p>As the regulatory framework continues to evolve, crypto businesses must stay informed of their compliance obligations and the accompanying risks.</p>
<p>For assistance with your crypto needs, feel free to contact a member of our team.</p>
<p>The post <a href="https://werksmans.com/cracking-down-or-catching-up-south-africas-approach-to-crypto-regulation-part-3-exchange-control/">Cracking Down or Catching Up? South Africa’s Approach to Crypto Regulation: Part 3 &#8211; Exchange Control</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Cracking Down or Catching Up? South Africa’s Approach to Crypto Regulation: Part 1 – Payments</title>
		<link>https://werksmans.com/cracking-down-or-catching-up-south-africas-approach-to-crypto-regulation-part-1-payments/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=cracking-down-or-catching-up-south-africas-approach-to-crypto-regulation-part-1-payments</link>
		
		<dc:creator><![CDATA[Armand Swart]]></dc:creator>
		<pubDate>Wed, 02 Jul 2025 12:55:34 +0000</pubDate>
				<category><![CDATA[Legal updates and opinions]]></category>
		<category><![CDATA[Technology & Innovation]]></category>
		<guid isPermaLink="false">https://www.werksmans.online/?p=23488</guid>

					<description><![CDATA[<p>by Armand Swart, Director, Hilah Laskov, Director and Hlonelwa Lutuli, Associate Introduction Crypto assets (“crypto“) exist in a unique regulatory space. Unlike traditional currency, crypto is not issued by central banks. Crypto can however be used in similar ways to traditional currency: they can be traded, used for payments, investing, security or capital raising. No  [...]</p>
<p>The post <a href="https://werksmans.com/cracking-down-or-catching-up-south-africas-approach-to-crypto-regulation-part-1-payments/">Cracking Down or Catching Up? South Africa’s Approach to Crypto Regulation: Part 1 – Payments</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><em>by Armand Swart, Director, Hilah Laskov, Director and Hlonelwa Lutuli, Associate</em></p>
<p><strong>Introduction</strong></p>
<p>Crypto assets (“<strong>crypto</strong>“) exist in a unique regulatory space. Unlike traditional currency, crypto is not issued by central banks. Crypto can however be used in similar ways to traditional currency: it can be traded, used for payments, investing, security or capital raising.</p>
<p>No single law governs crypto in South Africa. Instead, regulation is fragmented across various laws, demanding that organisations understand this legal landscape.</p>
<p>This article is the first in a series on crypto regulation, and it maps the current regulation of crypto relating to payments. In the articles that follow, we will also discuss the regulation of crypto through anti-money laundering and counter terrorism laws, financial services laws, and exchange control.</p>
<p><strong>Promise and peril: crypto assets as a form of payment</strong></p>
<p>Many businesses want to know if they can accept crypto as a form of payment and what the related risks and processes are if they choose to do so. In this section, we focus on the use of crypto for retail payments. The emergence of crypto has provided an alternative means of payment to traditional low value payment systems (“<strong>PS</strong>“) that use government-issued fiat currency (Rands in the case of South Africa), such as electronic funds transfers, debit cards, credit cards, real-time clearing, and even cash.</p>
<p>The South African Reserve Bank (“<strong>SARB</strong>“) – which regulates and enforces South Africa’s national payment system (“<strong>NPS</strong>“) – does not, however, presently recognise crypto as a form of currency: crypto is neither “money” issued by the SARB nor “e-money”, as only registered South African banks are able to issue e-money. This means that crypto as a means of payment is unregulated.</p>
<p><strong>Risks and benefits of crypto as a form of payment</strong></p>
<p>Using crypto as a means of payment has both risks and benefits. Crypto makes use of distributed ledger technology (“<strong>DLT</strong>“) to verify and record transactions between crypto wallets. A crypto payment is therefore completely “peer-to-peer” between the payer’s (the customer’s) and the payee’s (the retailer’s) crypto wallets. This is in contrast to a fiat payment, which needs to be (i) “cleared” (i.e. the exchange of payment instructions between banks); and (ii) “settled” (i.e. the discharge of payment obligations between banks in central bank assets). A traditional payment therefore requires a multitude of parties to function: this includes the issuing / payor bank and the acquiring / payee bank, but also the other role players in the NPS, including the payment system operator (STRATE, BankServ, Visa, Mastercard) and the South African Multiple Option Settlement (“<strong>SAMOS</strong>“) system operated by the SARB. It may also include other payment service providers, such as system operators or third party payment providers, if those are utilised.</p>
<p>The immutability of DLT means that crypto transactions are final and irreversible and can only be refunded by the receiving party. If a customer makes a mistake or their crypto wallet is used fraudulently by someone else, they will have no right of recourse. This is unlike card transactions which can be challenged on the basis of fraud and a charge back can be requested.</p>
<p>The lack of regulation and the absence of intermediaries for crypto payments greatly reduce the costs associated with the conventional banking system. The converse is that this can create risks for users. The role players in the NPS referred to above are highly regulated and must comply with the requirements of the NPS, including applicable laws, directives issued by the SARB, the Payment Association of South Africa’s Constitution, card issuer rules, and other documents. This is not the case for crypto, which is not issued by a central agency (like the SARB) and whose payments are otherwise unregulated. What this also means is that the usual charge back procedures, fraud prevention measures, and dispute resolution processes are not applicable to crypto payments, resulting in a lesser degree of consumer protection.</p>
<p>One of the main hallmarks of a traditional payment system is interoperability: different banks and their systems, applications, and infrastructure are able to communicate with each other. To put this in simple terms: even if I bank with FNB and a merchant banks with Standard Bank, by using the NPS, I am able to transact using my FNB card at a merchant and that transaction is cleared and settled between the banks in terms of the NPS.</p>
<p>Crypto eliminates single points of failure in the payment instruction and the transaction itself, making intermediary error or failure impossible, whereas traditional PSs carry the risk of fraud or of a role player in the payment chain making an error. Crypto nevertheless still carries risk in relation to a person’s digital wallet, as it stores their private keys. Crypto can also still be used for other fraudulent purposes, like money laundering and financing terrorism.</p>
<p><strong>Payment processes</strong></p>
<p>Crypto is not a currency and it is not currently part of the NPS: it functions on a separate system: the blockchain. With crypto payments, a retailer has to accept the specific crypto coin (e.g. Bitcoin or Ethereum) that the consumer wants to use, and it must have its own crypto wallet with a crypto address that may be used for the transfer of that specific crypto coin. Pick ‘n Pay – for example – accepts payment in Bitcoin.</p>
<p>There is, however, an increasingly popular alternative to accepting crypto directly: accepting crypto through a crypto wallet partnered with a mobile payment app or payment gateway / platform. The most widely used example of this is Luno’s partnership with Zapper, which allows crypto payments at over 31,000 Zapper merchants in the country, including Dischem and FlySafair.</p>
<p>How this works is that a customer using Luno can pay a merchant using crypto. Luno receives the cryptocurrency payment from the customer, then converts the crypto to the merchant’s chosen fiat currency (e.g. ZAR) and settles the merchant’s account. Zapper can be used for in-store and online purchases. This feature is particularly useful for merchants who want to accept crypto payments without dealing with the complexities of managing and converting the crypto themselves. It also helps to mitigate the volatility associated with crypto.</p>
<p>Luno has also partnered with Stich Pay to offer a similar service for online payments. VALR, ACT Pay, Fivewest, and Peach Payments are further examples of payment gateways that allow online payments using crypto in a similar manner. Keep in mind, however, that payment apps and gateways charge transaction fees, and by their nature add an additional intermediary between the customer and merchant.</p>
<p><strong>Concluding remarks on crypto payments</strong></p>
<p>Retailers are free to implement crypto as a form of payment. This is, however, done at their own risk and that of their consumers: any such use will not be subject to the usual protections of the NPS and the oversight of the SARB, although developments in the market mean that it is easier than ever for retailers to accept crypto as a means of payment.</p>
<p>While crypto is not regulated as part of the NPS, it is regulated by other means, primarily through the regulation of the institutions that provide <em>services</em> in relation to crypto, i.e. crypto asset service providers or CASPs. We will discuss how this regulation works in the articles that follow in this series.</p>
<p>For assistance with your crypto needs, contact a member of our team.</p>
<p>_________________________</p>
<p>Read more about our <a href="https://werksmans.com/practices/regulatory/">Regulatory</a> practice area.</p>
<p>The post <a href="https://werksmans.com/cracking-down-or-catching-up-south-africas-approach-to-crypto-regulation-part-1-payments/">Cracking Down or Catching Up? South Africa’s Approach to Crypto Regulation: Part 1 – Payments</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Technology &#038; AI  &#8211; in the workplace and beyond</title>
		<link>https://werksmans.com/technology-ai-in-the-workplace-and-beyond/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=technology-ai-in-the-workplace-and-beyond</link>
		
		<dc:creator><![CDATA[Preeta Bhagattjee]]></dc:creator>
		<pubDate>Wed, 26 Feb 2025 00:00:00 +0000</pubDate>
				<category><![CDATA[Legal updates and opinions]]></category>
		<category><![CDATA[Technology & Innovation]]></category>
		<guid isPermaLink="false">https://www.werksmans.online/technology-ai-in-the-workplace-and-beyond/</guid>

					<description><![CDATA[<p>by Preeta Bhagattjee, Director and Head of Technology &amp; Innovation &amp; Bradley Workman-Davies, Director   The rapid integration and adoption of artificial intelligence (AI) across various industries and sectors is transforming the way businesses operate, driving efficiency, innovation and cost savings and with its prolific progress, AI is reshaping the workplace at an unprecedented pace,  [...]</p>
<p>The post <a href="https://werksmans.com/technology-ai-in-the-workplace-and-beyond/">Technology &amp; AI  &#8211; in the workplace and beyond</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p><em>by Preeta Bhagattjee, Director and Head of Technology &amp; Innovation &amp; Bradley Workman-Davies</em>, Director</p>






<p>The rapid integration and adoption of artificial intelligence (AI) across various industries and sectors is transforming the way businesses operate, driving efficiency, innovation and cost savings. With its prolific progress, AI is reshaping the workplace at an unprecedented pace, from streamlining administrative tasks to optimising complex processes. However, as organisations increasingly prioritise AI investments over expanding their human workforce, concerns are mounting about the potential for workforce reductions. This shift raises critical questions about the balance between technological progress and maintaining employment opportunities, as businesses navigate the challenges and ethical considerations of a future where machines may replace a significant portion of human labour.</p>



<p>As an example of the rapidly changing face of the workplace, a survey by KPMG revealed that 68% of business leaders are under investor pressure to demonstrate returns on AI investments. This raises concerns about potential unemployment impacts and workforce reductions, particularly among white-collar workers. In the UK, one survey suggests that 51% of businesses plan to prioritise AI investment over hiring new staff, influenced by rising employment costs and economic pressures where cost-cutting may be achieved using technology.</p>



<p>All is not doom and gloom, however, as there is a growing demand for AI-related skills in the job market.  At the same time, there is an acknowledged drive to &#8220;do more with existing infrastructure&#8221; while maintaining existing staffing levels.  Some companies are using AI technologies to grow their business reach by, for example, integrating AI solutions into their services and product offerings. Another example is how generative AI is transforming the consulting industry by reducing employee workloads and increasing efficiency, with many firms developing AI tools to automate tasks such as email drafting and data formatting.  Target CEO Brian Cornell stated that AI would create more jobs rather than replace them, urging people to embrace technological advancements. In the US, Target plans to open 300 new stores over the next decade, focusing on enhancing customer experience with services like same-day delivery, supported by AI-driven efficiencies.</p>



<p>Although there may be shifts in the labour landscape, the introduction of AI into the workplace should not necessarily be seen as the death knell of employment, but rather as a re-alignment of employee functions and priorities, to enhance and capitalise on the productivity of functions that only employees can perform.  The labour legislative environment in South Africa, built around pivotal statutes such as the Labour Relations Act, the Basic Conditions of Employment Act and the Employment Equity Act, has more than proven itself to be adaptive and flexible enough to accommodate changing workplace requirements.</p>



<p>Provided that employers consider the various opportunities AI provides to innovate, enhance service delivery and focus employees on higher-level tasks not suited to AI, while freeing them from the tedium of the more administrative parts of their workload, the future workplace could be a transformative and positive space. This will continue to be a balancing act, especially as technologies advance at breakneck speed. We have only just gotten our heads around AI&#8217;s broader impact and utility, and we already have to contend with preparing for future technologies such as quantum computing and, in particular, quantum computing with AI (Quantum AI), which, when readily available, will have the capability to use artificial intelligence for unprecedentedly complex computation. Again, this technology could have many benefits for transforming workplaces, creating dynamic workplace efficiencies, improving productivity and creating many new job roles. Businesses will have to continue to balance the use of such technologies with the potential negative impacts and risks that come with them &#8211; on data privacy, data management and cybersecurity &#8211; as well as ensuring that the business remains compliant with all laws that apply to it. All of this means that, for the time being at least, human workers will not be replaced by robots in the workplace, as they are vital to ensuring that this balance remains in place.</p>
<p>The post <a href="https://werksmans.com/technology-ai-in-the-workplace-and-beyond/">Technology &amp; AI  &#8211; in the workplace and beyond</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>The AI National Policy: South Africa&#8217;s initial step to establish an AI policy and regulatory framework</title>
		<link>https://werksmans.com/the-ai-national-policy-south-africas-initial-step-to-establish-an-ai-policy-and-regulatory-framework/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=the-ai-national-policy-south-africas-initial-step-to-establish-an-ai-policy-and-regulatory-framework</link>
		
		<dc:creator><![CDATA[Preeta Bhagattjee]]></dc:creator>
		<pubDate>Wed, 10 Apr 2024 00:00:00 +0000</pubDate>
				<category><![CDATA[Legal updates and opinions]]></category>
		<category><![CDATA[Technology & Innovation]]></category>
		<guid isPermaLink="false">https://www.werksmans.online/the-ai-national-policy-south-africas-initial-step-to-establish-an-ai-policy-and-regulatory-framework/</guid>

					<description><![CDATA[<p>At the AI Government Summit on 5 April 2024, the Department of Communications and Digital Technologies ("DCDT") launched South Africa's Artificial Intelligence ("AI") Planning Discussion Document ("Discussion Document") with the purpose of the Discussion Document being to initiate discussion between the public and private sector to facilitate AI innovation and along with that, the development  [...]</p>
<p>The post <a href="https://werksmans.com/the-ai-national-policy-south-africas-initial-step-to-establish-an-ai-policy-and-regulatory-framework/">The AI National Policy: South Africa&#8217;s initial step to establish an AI policy and regulatory framework</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></description>
										<content:encoded><![CDATA[



<p>At the AI Government Summit on 5 April 2024, the Department of Communications and Digital Technologies (&#8220;<strong>DCDT</strong>&#8221;) launched South Africa&#8217;s Artificial Intelligence (&#8220;<strong>AI</strong>&#8221;) Planning Discussion Document (&#8220;<strong>Discussion Document</strong>&#8221;). The purpose of the Discussion Document is to initiate discussion between the public and private sectors to facilitate AI innovation and, along with that, the development of a national AI policy that will set out the government&#8217;s position on the adoption of AI in South Africa, government-led AI initiatives and a proposed regulatory framework and/or principles guiding the use and development of AI in South Africa.</p>



<p>This is a positive initial step: once a national policy is adopted as a White Paper, it would inform the legal framework and principles on which the appropriate governance or regulatory instruments, or any new legislation to regulate AI, are likely to be based. The Discussion Document appears to propose such legislation, including by providing various proposed dates by which specific AI &#8216;regulations&#8217; should be passed.</p>



<p>The DCDT, working together with an AI Expert Advisory Council (to be staffed by AI experts), will define the AI strategy to be embodied in the national AI policy. The DCDT proposes that the Artificial Intelligence Institute of South Africa will guide the DCDT in its development and implementation of the national AI policy and any AI-specific regulation. Whilst the approach to the regulation of AI is not conclusively set out in the Discussion Document, it seems that the DCDT intends to develop and implement some form of AI regulation (as alluded to in the keynote address by the Minister of the DCDT during the launch of the Discussion Document at the AI Government Summit).</p>



<p>Even though the development of national AI policy through the publication of the Discussion Document is in the early stages, it provides some initial insights into the government&#8217;s proposed approach to addressing the adoption of AI in South Africa. Some of the key proposals are:</p>



<ul class="wp-block-list">
<li>a national strategy for AI to advance economic growth, with a particular focus on key sectors, including manufacturing, agriculture, military capabilities, energy transition, healthcare and the automotive sector;</li>



<li>developing and implementing a tailored regulatory approach to AI in South Africa over the period of 2025 to 2027, which can take guidance from other legal systems and approaches adopted globally, for example the European Union&#8217;s AI Act, but should ultimately address the South African context and infrastructure capabilities;</li>



<li>ensuring that ethical considerations relating to AI use are appropriately addressed under the legal framework which is ultimately adopted or is included as guiding principles for the national AI policy &#8211; to guard against and mitigate any harms resulting from the deployment of AI in South Africa. Some of the harms to be addressed include:<ul><li>anti-competitive behaviour arising from the concentration of AI models between a small number of technology players;</li></ul><ul><li>risks associated with robotic or autonomous devices that use AI to make decisions;</li></ul><ul><li>social risk arising from job losses;</li></ul><ul><li>existential risks resulting from loss of control of AI models that pursue goals detrimental to humanity; and</li></ul>
<ul class="wp-block-list">
<li>risks of enhanced criminal behaviour or other dangerous outcomes resulting from AI designed for military purposes (used without appropriate controls); and</li>
</ul>
</li>



<li>the need for investment by government to implement targeted initiatives, such as:<ul><li>ensuring that education and training ecosystems suitably cater for AI literacy skills to address future AI skills requirements;</li></ul><ul><li>developing or investing in locally developed AI solutions, including through supporting and investing in tech start-ups in South Africa; and</li></ul>
<ul class="wp-block-list">
<li>upgrading and maintaining the public data architecture systems across key identified sectors and to mobilise e-government services utilising AI.</li>
</ul>
</li>
</ul>



<p>Critically, the Discussion Document appears to have been published as a rough draft. It is repetitive, contains conflicting provisions, and is not sufficiently advanced, specific or practical to set a clear policy approach and informed plan, or to offer meaningful guidance on the way forward. It is largely styled as a list of options, initial thoughts, information from other sources and recordals of developments in AI use, governance and regulation in other countries &#8211; suggesting, for example, that the persons charged with regulating or creating AI governance frameworks or laws &#8220;<em>should consider disclosure requirements around the use of AI techniques</em>&#8221;.</p>



<p>Optimistically, the Discussion Document does signal that policy and legislative developments for the use and adoption of AI in South Africa are on the horizon. It is hoped that a reworked Discussion Document will be published as a substantive AI policy plan, incorporating considered input from key stakeholders in the private and public sectors and from relevant AI bodies, such as the AI Expert Advisory Council. Such a plan should provide specific direction for South Africa, taking into account the learnings of other jurisdictions around the globe, including from a regulatory perspective, and implement effective mechanisms to foster and encourage the development and use of AI while striking a balance to manage the potential harms and ethical risks that have arisen with the proliferation of AI in all aspects of business and society.</p>
<p>The post <a href="https://werksmans.com/the-ai-national-policy-south-africas-initial-step-to-establish-an-ai-policy-and-regulatory-framework/">The AI National Policy: South Africa&#8217;s initial step to establish an AI policy and regulatory framework</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>International: Trends in AI governance</title>
		<link>https://werksmans.com/international-trends-in-ai-governance/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=international-trends-in-ai-governance</link>
		
		<dc:creator><![CDATA[Preeta Bhagattjee]]></dc:creator>
		<pubDate>Wed, 10 Jan 2024 00:00:00 +0000</pubDate>
				<category><![CDATA[Legal updates and opinions]]></category>
		<category><![CDATA[Technology & Innovation]]></category>
		<guid isPermaLink="false">https://www.werksmans.online/international-trends-in-ai-governance/</guid>

					<description><![CDATA[<p>READ - INTERNATIONAL: TRENDS IN AI GOVERNANCE We're thrilled to share an insightful article featured in OneTrust DataGuidance on the international trends in AI governance by Preeta Bhagattjee, Director and Head of the Technology &amp; Innovation practice.In this article, Preeta explores the latest developments in AI regulation and the various approaches taken by countries leading  [...]</p>
<p>The post <a href="https://werksmans.com/international-trends-in-ai-governance/">International: Trends in AI governance</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></description>
										<content:encoded><![CDATA[



<div class="wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex">
<div class="wp-block-button is-style-fill"><a class="wp-block-button__link has-black-background-color has-background wp-element-button" href="https://www.dataguidance.com/opinion/international-trends-ai-governance">READ &#8211; <span style="text-decoration: underline;">INTERNATIONAL: TRENDS IN AI GOVERNANCE</span></a></div>
</div>






<p>We&#8217;re thrilled to share an insightful article featured in OneTrust DataGuidance on the international trends in AI governance by Preeta Bhagattjee, Director and Head of the Technology &amp; Innovation practice.<br><br>In this article, Preeta explores the latest developments in AI regulation and the various approaches taken by countries leading AI innovation. Discover the market-driven approach adopted by the US and the UK, as well as the consumer protection-driven strategy of the EU.<br><br>Gain valuable insights into the legal and ethical implications of AI systems and the critical need for responsible governance.</p>



<div class="wp-block-buttons is-vertical is-layout-flex wp-container-core-buttons-is-layout-8cf370e7 wp-block-buttons-is-layout-flex">
<div class="wp-block-button"><a class="wp-block-button__link has-black-background-color has-background wp-element-button" href="https://www.dataguidance.com/opinion/international-trends-ai-governance">READ &#8211; <span style="text-decoration: underline;">INTERNATIONAL: TRENDS IN AI GOVERNANCE</span></a></div>
</div>
<p>The post <a href="https://werksmans.com/international-trends-in-ai-governance/">International: Trends in AI governance</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>AI Laws &#8211; The Year in Review</title>
		<link>https://werksmans.com/ai-laws-the-year-in-review/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=ai-laws-the-year-in-review</link>
		
		<dc:creator><![CDATA[Preeta Bhagattjee]]></dc:creator>
		<pubDate>Tue, 12 Dec 2023 00:00:00 +0000</pubDate>
				<category><![CDATA[Legal updates and opinions]]></category>
		<category><![CDATA[Technology & Innovation]]></category>
		<guid isPermaLink="false">https://www.werksmans.online/ai-laws-the-year-in-review/</guid>

					<description><![CDATA[<p>DOWNLOAD AI LAWS - THE YEAR IN REVIEW PDF HERE Almost synonymous with 2023 is the term Artificial Intelligence (AI) and on a global basis, this year has seen significant debate on and development in the area of regulation of AI. This is against AI’s increased accessibility and the unprecedented speed of AI development with  [...]</p>
<p>The post <a href="https://werksmans.com/ai-laws-the-year-in-review/">AI Laws &#8211; The Year in Review</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></description>
										<content:encoded><![CDATA[



<div class="wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex">
<div class="wp-block-button"><a class="wp-block-button__link wp-element-button" href="https://werksmans.com/wp-content/uploads/2023/12/Werksmans-AI-Regulation-Global-Snapshot-.pdf"><strong>DOWNLOAD AI LAWS &#8211; THE YEAR IN REVIEW PDF HERE</strong></a></div>
</div>









<p>Almost synonymous with 2023 is the term Artificial Intelligence (AI), and on a global basis this year has seen significant debate on, and development in, the regulation of AI. This is against the backdrop of AI’s increased accessibility and the unprecedented speed of AI development, with the emergence of new use cases such as generative AI and multiple applications in almost every industry and sector.</p>



<p>Regulation has become a global concern where the need for ethical and responsible AI is critical, and many regional developments have occurred as a result. As far back as 2019, the Organisation for Economic Co-operation and Development (“OECD”) issued its AI Principles, with recommendations for member states to include in their national AI policies. In Africa, as of November 2023, the African Union is developing a Conceptual Framework of the Continental Strategy on Artificial Intelligence to address principles and strategic objectives for the responsible and safe use of AI. The Bletchley Declaration on AI was signed by the 27 countries in attendance at the recent AI Safety Summit, acknowledging the global opportunities and risks of AI.</p>



<p class="has-text-align-left">While the debate on whether and how to regulate AI continues, many countries are in the process of developing or implementing laws and regulations to govern the use and impact of AI. Different approaches to regulation have emerged. Some regulation takes the form of formal legislation &#8211; regulating what AI may be developed and how, and providing for sanctions for offenders. An example is the EU, where the three branches of the EU government &#8211; the European Parliament, Council and Commission &#8211; recently announced their provisional agreement on the Draft EU AI Act, bringing us one step closer to the adoption of the world’s first comprehensive AI law. Some countries have adopted a more flexible, innovation-friendly soft-law approach of providing for AI ethical-use frameworks and/or sector-specific guidelines, such as the UK. Some regulators/governments, such as the United Arab Emirates, have taken a phased approach of interim soft-law options while they navigate the many legal and regulatory unknowns accompanying AI, favouring an approach where formal legislation may be enacted further down the line.</p>



<p class="has-text-align-left">This AI Global Snapshot provides a high-level overview of AI laws, regulations and guidelines and frameworks which have emerged in the past year.</p>
<p>The post <a href="https://werksmans.com/ai-laws-the-year-in-review/">AI Laws &#8211; The Year in Review</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Liability for defamation by AI</title>
		<link>https://werksmans.com/liability-for-defamation-by-ai/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=liability-for-defamation-by-ai</link>
		
		<dc:creator><![CDATA[Preeta Bhagattjee]]></dc:creator>
		<pubDate>Wed, 01 Nov 2023 00:00:00 +0000</pubDate>
				<category><![CDATA[Legal updates and opinions]]></category>
		<category><![CDATA[Disputes]]></category>
		<category><![CDATA[Technology & Innovation]]></category>
		<guid isPermaLink="false">https://www.werksmans.online/liability-for-defamation-by-ai/</guid>

					<description><![CDATA[<p>Generative AI has exploded into the public consciousness and into widespread use with the emergence of language processing tools (or large language models (LLMs)) such as ChatGPT. The objective is to mimic human-generated content so precisely that the artificially generated content can be indistinguishable. This is achieved by assimilating and analysing original content on which  [...]</p>
<p>The post <a href="https://werksmans.com/liability-for-defamation-by-ai/">Liability for defamation by AI</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p></p>


<p>Generative AI has exploded into the public consciousness and into widespread use with the emergence of language processing tools (or large language models (LLMs)) such as ChatGPT. The objective is to mimic human-generated content so precisely that the artificially generated content can be indistinguishable.</p>
<p>This is achieved by assimilating and analysing original content on which the tool has been trained, as supplemented by further learnings from prompts and its own generated content &#8211; essentially by learning patterns and relationships between words and phrases in natural language to repetitively predict the likeliest next word in a string based on what it has already seen and continues these predictions until its answer is complete[1].</p>
<p>A curious feature of LLMs is that they sometimes produce false and even damaging output. Instances of lawyers including fictitious AI-generated case law in their submissions to court are already well known, but LLMs can and do go further.</p>
<p>Outputs can generate false and defamatory content that has the potential to cause a person actual reputational damage. This can even include fabricating non-existent &#8220;quotes&#8221; purportedly from newspaper articles and the like.</p>
<p>This tendency to make things up is referred to as hallucination, and some experts regard it as a problem inherent in the mismatch between the way generative AI functions and the uses to which it is put. For the time being, at least, it is a persistent feature of generative AI[2].</p>
<p>This inevitably raises the question of where legal liability rests when LLMs generate false and harmful content.</p>
<p>In the USA, much of the debate has centred on whether the creator of the LLM &#8211; such as OpenAI in the case of ChatGPT &#8211; can be held liable, in light of the statutory protection afforded to hosts of other content providers&#8217; online content under section 230 of the U.S. Code. The generally held view, however, appears to be that generative AI tools do not fall within the protection afforded under this law, as they generate new content and do not merely host third-party content.</p>
<p>In the EU, the European Commission&#8217;s proposed AI Liability Directive, currently still in draft form, will work in conjunction with the EU AI Act and make it easier for anyone injured by AI-related products or services to bring civil liability claims against AI developers and users[3].</p>
<p>The EU AI Act, also currently in draft form, proposes the regulation of the use and development of AI through the adoption of a &#8216;risk-based&#8217; approach that imposes significant restrictions on the development and use of &#8216;high-risk&#8217; AI.</p>
<p>Although the current draft of the Act does not criminalise contravention of its provisions, the Act empowers authorised bodies to impose administrative fines of up to 20,000,000 EUR or 4% of an offending company&#8217;s total worldwide annual turnover, for non-compliance of a particular AI system with any requirements or obligations under the Act.[4]</p>
<p>In the UK, a government White Paper on AI regulation recognises the need to consider which actors should be liable, but goes on to say that it is &#8216;too soon to make decisions about liability as it is a complex, rapidly evolving issue'[5].</p>
<p>The position in South Africa is governed by the common law pertaining to personality injury.</p>
<p>The creator of the LLM would presumably be viewed as a media defendant, meaning that a lower level of <em>animus iniuriandi &#8211; </em>namely negligence &#8211; would be required to establish a defamation claim than if the defendant were a private individual. What would constitute negligence in the case of a creator of an LLM that is known to hallucinate is an open question, which may depend on whether reasonable measures to eliminate or mitigate the known risks could have been put in place by the creator[6].</p>
<p>What is clear is that disclaimers stressing the risk that the output of the LLM will contain errors &#8211; which AI programmes often contain &#8211; would not immunise AI owners from liability, because they could at most operate as between the AI company and the user, but would not bind the defamed person.</p>
<p>But on a practical level, the potential liability of the AI creator would be of less importance to a South African plaintiff, because the creator would have to be sued in the jurisdiction where it is located (except in the unlikely event that it had assets in SA capable of attachment to found jurisdiction), rendering such claims prohibitively costly.</p>
<p>The potential liability of the user of the LLM, who then republishes the defamatory AI-generated output, is another matter.</p>
<p>Firstly, it is no defence to a defamation action to say that you were merely repeating someone else&#8217;s statement. Secondly, the level of <em>animus iniuriandi</em> required would depend on the identity of the defendant.</p>
<p>If the defendant was a media company &#8211; for example, an entity that uses AI to aggregate and summarise news content &#8211; then only negligence would be required, and that might consist of relying on an LLM known to hallucinate without putting the necessary steps in place to catch false and harmful output.</p>
<p>If on the other hand the defendant was a private individual using the AI to generate text, then the usual standard of intent would apply, which would obviously make a claim much harder to establish. Intent however includes recklessness.</p>
<p>It remains to be seen whether our Courts would consider it reckless to repeat a defamatory AI-generated statement in light of the caveats that AI creators have published against the use of their AI tools.</p>
<p>For example, OpenAI has provided users with a number of warnings that ChatGPT “can occasionally produce incorrect answers” and “may also occasionally produce harmful instructions or biased content”.[7]</p>
<p>It remains to be seen what approach the Courts will adopt regarding false and defamatory AI-generated content.</p>
<p>We anticipate that in dealing with these questions, the Courts will have to engage with questions of public policy, such as balancing the competing interests of reputational rights with not imposing undue burdens on innovation and the use of new technologies.</p>
<p>As LLMs are increasingly integrated into larger platforms (e.g. search engines), so their content will be published more widely and the risk of reputational harm to individuals referred to will increase[8].</p>
<p>This area of delictual and product-related liability can be expected to develop rapidly in the coming years.</p>
<hr />
<h6>Footnotes</h6>
<h6>[1] Natasha Lomas, &#8220;Who&#8217;s liable for AI-generated lies?&#8221;, 1 June 2022</h6>
<h6>[2] Matt O&#8217;Brien, &#8220;Tech experts are starting to doubt that ChatGPT and A.I. &#8216;hallucinations&#8217; will ever go away: &#8216;This isn&#8217;t fixable'&#8221;, 1 August 2023</h6>
<h6>[3] Womble Bond Dickinson, &#8220;AI and liability 2023: a guide to liability rules for Artificial Intelligence in the UK and EU&#8221;, 9 May 2023. The Directive will <em>inter alia</em> include a presumption of causality, i.e. a rebuttable presumption that the action or output of the AI <em>was</em> caused by the AI developer or user against whom the claim is filed.</h6>
<h6>[4] <a href="https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:52021PC0206">Regulation of The European Parliament and of The Council</a></h6>
<h6>[6] Volokh <em>op. cit.</em> discusses the following possible precautions which may or may not be technically feasible: quote-checking; avoiding quotes altogether; double-checking output; notice-and-blocking mechanisms for reported falsehoods; and discontinuing earlier versions of the LLM when new versions prove materially more reliable. The last-mentioned is particularly interesting, because leaving an earlier and admittedly less reliable version in the market, and making it free to use, could reasonably be construed as negligent in and of itself.</h6>
<h6>[7] <a href="https://help.openai.com/en/articles/6783457-what-is-chatgpt">What is ChatGPT? </a></h6>
<h6>[8] Farrer &amp; Co, &#8220;Can ChatGPT and other generative AI tools be liable for producing inaccurate content?&#8221;, <em>per</em> Ian de Freitas and Thomas Rudkin, 6 July 2023</h6><p>The post <a href="https://werksmans.com/liability-for-defamation-by-ai/">Liability for defamation by AI</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Ruling in favour of the Digital Age: Local and foreign courts give a   to electronic agreements and signatures</title>
		<link>https://werksmans.com/ruling-in-favour-of-the-digital-age-local-and-foreign-courts-give-a-to-electronic-agreements-and-signatures/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=ruling-in-favour-of-the-digital-age-local-and-foreign-courts-give-a-to-electronic-agreements-and-signatures</link>
		
		<dc:creator><![CDATA[Preeta Bhagattjee]]></dc:creator>
		<pubDate>Wed, 02 Aug 2023 00:00:00 +0000</pubDate>
				<category><![CDATA[Legal updates and opinions]]></category>
		<category><![CDATA[Technology & Innovation]]></category>
		<guid isPermaLink="false">https://www.werksmans.online/ruling-in-favour-of-the-digital-age-local-and-foreign-courts-give-a-to-electronic-agreements-and-signatures/</guid>

					<description><![CDATA[<p>and Karabo Kekana, Candidate Attorney Recently a Canadian court decided that a emoji constituted an electronic signature and resulted in a binding and enforceable sale agreement. In RSA, a court held that a credit agreement entered into and signed electronically was enforceable despite the consumer's arguments to the contrary. Digital agreements - of various types  [...]</p>
<p>The post <a href="https://werksmans.com/ruling-in-favour-of-the-digital-age-local-and-foreign-courts-give-a-to-electronic-agreements-and-signatures/">Ruling in favour of the Digital Age: Local and foreign courts give a   to electronic agreements and signatures</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><em>and Karabo Kekana, Candidate Attorney</em></p>
<p>Recently a Canadian court decided that a thumbs up emoji constituted an electronic signature and resulted in a binding and enforceable sale agreement. In South Africa, a court held that a credit agreement entered into and signed electronically was enforceable despite the consumer&#8217;s arguments to the contrary. Digital agreements &#8211; of various types &#8211; have become the norm rather than the exception. This article discusses these recent cases in the context of a digitalised world and what this means for contracting parties.</p>
<h3><strong>Our digital world  </strong></h3>
<p>Modern society is firmly embedded in the digital age. The rapid advancements in the use of artificial intelligence are evidence of this.[1] This has various practical implications for businesses and individuals alike &#8211; including in contracting practices, where an increasing number of agreements are being negotiated and concluded electronically, replacing traditional paper-based contracting. The benefit of such agreements is that parties are able to negotiate and conclude them remotely and with relative ease. However, electronic contracting poses risks relating to contractual certainty and security.</p>
<h3><strong>The Canadian approach: <em>South West Terminal Ltd. v Achter Land &amp; Cattle Ltd. </em>(&#8220;<em>SWT</em>&#8221;)[2]</strong></h3>
<p>The recent <em>SWT </em>decision in Canada concerned a summary judgment application brought by South West Terminal Ltd. (&#8220;<strong>SWT</strong>&#8221;) against Achter Land &amp; Cattle Ltd (&#8220;<strong>Achter</strong>&#8221;). SWT and Achter had an ongoing business relationship in which Achter sold flax to SWT, which in turn on-sold it to third parties. SWT alleged that a sale agreement for flax had been entered into: a representative of SWT had signed a sale agreement with a wet ink signature and sent a photo of it to a representative of Achter with a text message which read: &#8220;<em>please confirm flax contract</em>&#8221;. The Achter representative responded by text with a thumbs up emoji. Achter never delivered the flax, and SWT alleged that the agreement had been breached and sought damages.[3]</p>
<p>The Canadian court considered whether the agreement was concluded in writing and signed by both parties: these being requirements of the Canadian laws that applied to the contract in question.[4]</p>
<p>The court held that the thumbs up emoji constituted an electronic signature and thus met the legal requirement for the agreement to be signed. In making its determination the court considered that the emoji originated from Achter&#8217;s representative and his unique cell phone number, and that there was no issue as regards the authenticity of the text message containing it.[5]</p>
<p>The court considered that there was an established pattern between the parties: the SWT representative would text Achter&#8217;s representative a photo of a contract and ask him to confirm it. Achter had previously replied with &#8220;<em>looks good</em>&#8221;, &#8220;<em>ok</em>&#8221; or &#8220;<em>yup</em>&#8221; and had always proceeded to deliver the flax as agreed per text.[6] The court held that an objective bystander would conclude that the parties had entered into an agreement in the same manner as they had previously done, and therefore held the contract to be enforceable.[7]</p>
<h3><strong>ECTA and the NCA: what the legislators have to say </strong></h3>
<p>In South Africa the use of electronic agreements and signatures is governed by the Electronic Communications and Transactions Act[8] (&#8220;<strong>ECTA</strong>&#8221;), which applies to electronic transactions and data messages (information sent, received, generated or stored by electronic means).[9] ECTA allows other Acts, such as the National Credit Act[10] (&#8220;<strong>NCA</strong>&#8221;), to have their own provisions and requirements for data messages (including electronic agreements).</p>
<p>ECTA governs the use of &#8220;<em>electronic signatures</em>&#8221; and distinguishes between two types &#8211;</p>
<ul>
<li>an &#8220;<strong><em>Electronic Signature</em></strong>&#8221; which comprises data intended by the user to be a signature. Examples include a digitally drawn signature, a scanned image of a signature or a digital signature produced by an application (&#8220;<strong>Ordinary Electronic Signature</strong>&#8221;); and</li>
<li>an &#8220;<strong><em>Advanced Electronic Signature</em></strong>&#8221; which is an electronic signature resulting from a process accredited by the South African Accreditation Authority (or SAAA).[11] A person has to go through a face-to-face authentication process to make use of an Advanced Electronic Signature; such signatures are therefore much less common than Ordinary Electronic Signatures.<sup>[12]</sup></li>
</ul>
<p>ECTA provides that where a law specifically requires the signature of a person and that law does not specify the type of signature, an Advanced Electronic Signature rather than an Ordinary Electronic Signature must be used. Where the parties have not agreed to a specific type of electronic signature (and the law does not require one), an Ordinary Electronic Signature may be used, provided the method used is reliable and identifies the signatory.[13]</p>
<p>Section 2(3) of the NCA provides that where the NCA requires a document to be signed by a party to a credit agreement, an Ordinary Electronic Signature or Advanced Electronic Signature may be used, <u>but</u> where an Ordinary Electronic Signature is used it must be applied by each party in the physical presence of the other. Nowhere in the NCA or its regulations[14] is it stated that a credit agreement has to be signed. On the one hand, the regulations contain a form with which small credit agreements must comply, requiring a signature;[15] on the other hand, the regulations also permit other types of credit agreements which do not require a signature, namely telephonic and electronic agreements.[16] It is thus a grey area whether the NCA actually requires a credit agreement to be signed to be valid, but in practice credit providers tend to err on the side of caution by requiring some form of signature for written credit agreements, whether wet ink or electronic.</p>
<h3><strong><em>Firstrand Bank Limited t/a Wesbank v Govender</em></strong><strong> (&#8220;<em>Govender</em>&#8221;):[17] what the court said </strong></h3>
<p>In <em>Govender </em>a consumer defaulted on payments in terms of an instalment sale agreement for a car. The consumer&#8217;s defence to the credit provider&#8217;s claim for payment was that he never signed the &#8220;<strong><em>I-contract</em></strong>&#8221; in question and that the agreement (entered into and signed electronically) was invalid and unenforceable as it failed to comply with ECTA&#8217;s electronic signature requirements. The contracting process works as follows: after a consumer selects a vehicle at a dealership, they register their details and receive a one-time pin (&#8220;<strong>OTP</strong>&#8221;), which they enter to access and sign the I-contract electronically. The credit provider&#8217;s watermark stamp on each page evidenced that the consumer signed the contract electronically. The consumer is also required to produce their identity documents to verify their identity.[18] This signature process constitutes an Ordinary Electronic Signature rather than an Advanced Electronic Signature.</p>
<p>The court in <em>Govender</em> held that ECTA settled any uncertainty relating to electronic agreements, that data messages and electronic signatures are equivalent to their paper-based counterparts, and that the primary question was whether the requirements for a valid agreement had been met.[19] The court held that the I-contract was a valid and enforceable contract based on the evidence presented by the parties.[20]</p>
<p>The court in <em>Govender </em>did not specifically deal with the NCA&#8217;s requirements for face-to-face authentication of an Ordinary Electronic Signature; nevertheless, there is an argument to be made that those requirements would have been met, given that the consumer was authenticated in person when the identity documents were provided to the credit provider. Furthermore, the High Court has previously accepted a signature to an I-contract credit agreement as valid for purposes of the NCA.[21]</p>
<h3><strong>Risky business: the downside of electronic agreements and signatures </strong></h3>
<p>The <em>Govender</em> case highlights that although Ordinary Electronic Signatures are convenient and efficient, there are security and authenticity risks associated with them. It can be difficult to verify whether an Ordinary Electronic Signature is genuine or forged, and such signatures are also vulnerable to hacking, leading to the possibility of fraud and identity theft. However, the risk of using Ordinary Electronic Signatures depends on the type of electronic signature mechanism employed, with some forms being more secure and therefore more reliable than others.</p>
<h3><strong>The use of smart contracts </strong></h3>
<p>A smart contract is a self-executing protocol stored on a blockchain. Smart contracts are transparent, verifiable and immutable: once a smart contract has been deployed on the blockchain, its code cannot be changed or tampered with. They therefore increase efficiency, trust and security in the contracting process. Further, the code of the contract can be audited by anyone, promoting transparency.</p>
<p>Smart contracts are recognised as electronic agreements in terms of ECTA. Although the signatures generally applied to a smart contract would constitute an Ordinary Electronic Signature (and not an Advanced Electronic Signature), given the security which the blockchain mechanism provides and the auditability of the process, the contract and electronic signature would offer more security than other forms of Ordinary Electronic Signatures. The process also promotes certainty of contracting.</p>
<h3><strong>Conclusion: where to from here? </strong></h3>
<p>The <em>Govender </em>decision reaffirms our courts&#8217; approach of accepting electronic agreements and signatures, indicating that our courts are not willing to invalidate electronic acts purely on the basis that they are electronic rather than paper-based. The <em>SWT </em>decision indicates the risk of using informal electronic channels to negotiate contracts. Businesses and consumers alike should ensure that they use electronic contracting and signature mechanisms that are sufficiently secure and robust to provide the requisite legal certainty and security. Businesses could consider ensuring enhanced security and authentication by using smart contracts in appropriate applications.</p>
<p>&nbsp;</p>
<hr />
<h6>Footnotes</h6>
<h6>[1] See the LegalWerks article by the co-author, Preeta Bhagattjee, on <a href="https://werksmans.com/legal-updates-and-opinions/generative-ai-its-magic-but-fraught-with-legal-risks/">Generative AI and its risks</a></h6>
<h6>[2] 2023 SKKB 116 (heard by the court of the King&#8217;s Bench for Saskatchewan, a province in Canada).</h6>
<h6>[3] See paragraph 15 of the judgment for a summary of the facts.</h6>
<h6>[4] <em>The Sales of Goods Act, </em>RSS 1978, c S-1 (in terms of section 6(1) which provided that the Act applied to contracts of sale for $50 upwards) and <em>The Electronic Information and Documents Act, 2000, SS 2000, c E-7.222 </em>were held by the court to be applicable to the flax sale contract.</h6>
<h6>[5] <em>SWT</em>, paragraphs 60 to 64.</h6>
<h6>[6] <em>SWT</em>, paragraph 21.</h6>
<h6>[7] <em>SWT</em>, paragraph 36.</h6>
<h6>[8] No 25 of 2002.</h6>
<h6>[9] ECTA<em>, </em>section 1.</h6>
<h6>[10] No 34 of 2005.</h6>
<h6>[11] Currently only LawTrust and the South African Post Office are accredited.</h6>
<h6>[12] See section 1 of the ECTA for these definitions.</h6>
<h6>[13] See section 13 of the ECTA.</h6>
<h6>[14] <em>GNR.489 of 31 May 2006: Regulations made in terms of the National Credit Act, 2005 </em>(the &#8220;<strong>regulations</strong>&#8221;).</h6>
<h6>[15] Small credit agreements must comply with Form 20.2 which requires a signature (regulation 30(2) of the NCA regulations).</h6>
<h6>[16] The regulations provide that transcripts are sufficient in this regard provided that the consumer is supplied with a copy (see regulation 30(3) of the NCA regulations for small credit agreements and regulation 31(4) for large and intermediate credit agreements).</h6>
<h6>[17] (2021/25131) [2023] ZAGPJHC 610 (1 June 2023). (Note that the neutral citation refers to &#8220;Govendor&#8221; and not &#8220;Govender&#8221;; we use the latter spelling in this article in line with the spelling used in the judgment).</h6>
<h6>[18] See paragraphs 10 to 12 of the judgment for a description of the process.</h6>
<h6>[19] <em>Govender</em>, paragraphs 24 to 27.</h6>
<h6>[20] <em>Govender</em>, paragraph 32.</h6>
<h6>[21] See also the decision of <em>First Rand Bank t/a Wesbank v Molamugae</em> (24558/2016) [2018] ZAGPPHC 762 (26 February 2018) where the court held that an I-contract for an instalment sale agreement had been validly entered into and signed electronically with reference to the NCA&#8217;s requirements.</h6>
<p>The post <a href="https://werksmans.com/ruling-in-favour-of-the-digital-age-local-and-foreign-courts-give-a-to-electronic-agreements-and-signatures/">Ruling in favour of the Digital Age: Local and foreign courts give a thumbs up to electronic agreements and signatures</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Generative AI: It’s magic but fraught with legal risks</title>
		<link>https://werksmans.com/generative-ai-its-magic-but-fraught-with-legal-risks/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=generative-ai-its-magic-but-fraught-with-legal-risks</link>
		
		<dc:creator><![CDATA[Preeta Bhagattjee]]></dc:creator>
		<pubDate>Tue, 06 Jun 2023 00:00:00 +0000</pubDate>
				<category><![CDATA[Legal updates and opinions]]></category>
		<category><![CDATA[Technology & Innovation]]></category>
		<guid isPermaLink="false">https://www.werksmans.online/generative-ai-its-magic-but-fraught-with-legal-risks/</guid>

					<description><![CDATA[<p>and Hlonelwa Lutuli, Candidate Attorney The use and beneficial application of generative AI in the workplace is increasing at an exponential rate - with many businesses actively developing and adapting AI or AI being used organically within businesses without clear guidelines for use being in place. While no AI-specific law or regulation has been passed  [...]</p>
<p>The post <a href="https://werksmans.com/generative-ai-its-magic-but-fraught-with-legal-risks/">Generative AI: It’s magic but fraught with legal risks</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><em>and Hlonelwa Lutuli, Candidate Attorney</em></p>
<p>The use and beneficial application of generative AI in the workplace is increasing at an exponential rate &#8211; with many businesses actively developing and adapting AI tools, or AI being used organically within businesses, without clear guidelines for its use being in place.</p>
<p>While no AI-specific law or regulation has been passed in South Africa as yet, navigating the significant legal risks which AI adoption poses has become crucial. The following key legal considerations should be front of mind for effectively and uniformly managing both the use and development of generative AI tools within a business:</p>
<ul>
<li><strong>Confidentiality risks:</strong>
<ul>
<li>Information entered into prompts on generative AI systems will not remain confidential and may be shared with third parties for review purposes. Generative AI may use the data provided by a user to train and improve its AI model. For example, one of the ways in which OpenAI improves ChatGPT is by training the AI on the conversations people have with it, unless users opt out of the use of their data for such training or model-improvement purposes, or unless the enterprise API is used (in which case a user has to specifically opt in to share data). Consequently, exposing business or third party confidential or proprietary information in prompts may breach contractual or statutory confidentiality obligations and compromise company trade secrets.</li>
</ul>
</li>
</ul>
<ul>
<li><strong>Data Protection considerations:</strong>
<ul>
<li>The inputting, reuse, accessing, sharing and further processing of identifiable personal information in AI systems could fall foul of the <a href="https://www.gov.za/documents/protection-personal-information-act#:~:text=to%20provide%20for%20the%20rights,provide%20for%20matters%20connected%20therewith." target="_blank" rel="noopener">Protection of Personal Information Act of 2013.</a></li>
<li>Data entered into prompts may be transferred across borders (including in the case of OpenAI whose servers are based outside of South Africa) where such transfers are subject to specific limitations in terms of data protection law.</li>
<li>More stringent legislative rules generally apply to the processing of sensitive or special personal information, such as health data and biometric data (e.g. as used in facial recognition technology) and any collection, use and sharing of such information should be evaluated to address any privacy risks.</li>
<li>The use of algorithms which undertake automated decision-making tasks should also be interrogated to ensure compliance with data protection laws.</li>
</ul>
</li>
</ul>
<ul>
<li><strong>Competition law considerations:</strong>
<ul>
<li>Accidentally or deliberately accessing a competitor&#8217;s business information, trade secrets, confidential information or other competitively sensitive information using generative AI, or sharing your own business information on a public AI platform, could have anti-competitive implications. Such information could be used to predict competitor behaviour or to adjust or co-ordinate pricing, enabling competitors to directly or indirectly participate in price fixing or collusive tendering. Even if price fixing or collusive tendering does not occur, possession and awareness of a competitor&#8217;s competitively sensitive information is, in certain circumstances, regarded by the Competition Commission as a contravention of the Competition Act 89 of 1998 or as indicative of an underlying anti-competitive arrangement.</li>
</ul>
</li>
</ul>
<ul>
<li><strong>Intellectual Property:</strong>
<ul>
<li>The use of generated content from generative AI may constitute copyright infringement in terms of the Copyright Act of 1978 on two grounds: (i) where the training datasets on which the generative AI tool has been fed or trained include copyrighted works that neither the generative AI owner nor the user has a licence to use; or (ii) where the generative AI tool produces responses or generates works that are similar to existing and protected works, or replicates an existing work that is protected under copyright or other intellectual property laws.</li>
<li>In respect of the intellectual property rights in new works created by using AI, the Copyright Act may vest authorship of the AI-generated content in either the user who inputs the prompt or the creator of the AI tool. Even though the OpenAI terms of use specifically assign intellectual property rights in generated content to the user, companies should consider appropriate mechanisms to address the risk that rights in works generated by their employees&#8217; use of generative AI tools may not vest in the company. Businesses may not be able to enforce intellectual property rights against third party use of such generated works.</li>
</ul>
</li>
</ul>
<ul>
<li><strong>Exposing proprietary source code on generative AI systems or using generative AI to develop proprietary code:</strong>
<ul>
<li>A company&#8217;s proprietary computer code which is made accessible to a generative AI system could be exposed to the public and result in infringement of its intellectual property rights. Significant security risks may also result from the exposure of the source code on an AI system. Another consideration is that such code may be subsumed into open source software.</li>
<li>Where a company uses generative AI to develop computer code, the new work may infringe a third party&#8217;s intellectual property rights where third party proprietary code is incorporated into the generated result. Such generated code may also fail to meet compliance and/or industry standards for mitigating vulnerabilities and meeting minimum security requirements.</li>
</ul>
</li>
</ul>
<ul>
<li><strong>Incorrect or discriminatory information:</strong>
<ul>
<li>As generative AI is trained on data which may contain incorrect information or reflect biases or offensive content, there is a risk that the AI tool outputs may be false and/or could be considered discriminatory or offensive. Distribution of such content within the workplace may have implications under the Employment Equity Act and Labour Relations Act. The distribution of incorrect or offensive or discriminatory content outside of the business could give rise to potential civil and delictual liability.</li>
</ul>
</li>
</ul>
<ul>
<li><strong>Reputational risk:</strong>
<ul>
<li>A company&#8217;s failure to avoid or mitigate the legal risks highlighted above can cause lasting damage to the organisation&#8217;s reputation, as such a failure may leave the company&#8217;s business partners, clients and trade secrets at risk. Further, with the boom of AI-detection software in the market, companies may also need to consider the reputational risk of using content that can easily be identified as AI-generated without labelling it as such.</li>
</ul>
</li>
</ul>
<p>Interesting Read &#8211; <a href="https://werksmans.com/legal-updates-and-opinions/raging-against-the-machine-the-rise-of-artificial-intelligence-in-the-workplace/">Raging against the machine</a></p>
<p>The innovation and digitalisation strategies of many businesses increasingly rely on AI tools, and failing to incorporate some form of generative AI into a business&#8217; operations may blunt its competitive edge against competitors who are effectively using AI to optimise their operations. The key to successfully managing AI risk and navigating any legal and reputational risks therefore lies in adopting appropriate rules and policy guidelines for the consistent and responsible use of AI within a company&#8217;s operations, including implementing appropriate internal measures to mitigate such risks.</p>
<p>The post <a href="https://werksmans.com/generative-ai-its-magic-but-fraught-with-legal-risks/">Generative AI: It’s magic but fraught with legal risks</a> appeared first on <a href="https://werksmans.com">Werksmans Attorneys</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
