Just published in the International In-house Counsel Journal! Full text here. Print edition available in April 2018.
People analytics – the focusing of so-called big data on human resources – is a tool for finding the factors that correlate with workplace success. Done right, people analytics can be an extraordinarily powerful tool for testing, understanding, and advancing the practice of human resources. Done incorrectly, it can bake bias right into the core of your human resources activities. The story told about the future of people analytics is the stuff of utopian writing: that computers will take massive amounts of data (say, the history of your company’s hiring to date and the success, or lack of success, of candidates) and derive from that data a way to select the best candidates for the job. And because these processes are based on data, math, and formulas, we are led to believe that the results will be free from the usual bias associated with hiring. Put more succinctly, people analytics should help us learn from the past to determine who will be more likely to succeed and who will be more likely to fail. Unfortunately, the truth is quite different. With smart lawyering, however, we can create better systems and help our organizations better understand (and thus manage) the risk associated with people analytics.
AI Now – a group of researchers from Google Open Research, Microsoft Research, and New York University – has issued its top 10 recommendations for AI. Here are two recommendations of note:
2 — Before releasing an AI system, companies should run rigorous pre-release trials to ensure that they will not amplify biases and errors due to any issues with the training data, algorithms, or other elements of system design. As this is a rapidly changing field, the methods and assumptions by which such testing is conducted, along with the results, should be openly documented and publicly available, with clear versioning to accommodate updates and new findings.
4 — More research and policy making is needed on the use of AI systems in workplace management and monitoring, including hiring and HR. This research will complement the existing focus on worker replacement via automation. Specific attention should be given to the potential impact on labor rights and practices, and should focus especially on the potential for behavioral manipulation and the unintended reinforcement of bias in hiring and promotion.
A worthwhile read.
Often overlooked, payroll professionals play a key role in the processing and protection of sensitive personnel information. Here is the deck from a presentation to the NYS Payroll Association, offering a global perspective on privacy and cybersecurity.
I am pleased to announce that my paper, tentatively titled “Countering Bias in Expert HR Systems: A Guide for In-House Counsel,” has been accepted for publication in the International In-house Counsel Journal.
The paper will present a very user-friendly guide to understanding and managing the risk from expert HR systems. As I’ve argued in this blog, management-side employment counsel must get deep under the hood of expert systems designed to perform evaluative functions on candidates and employees. From procurement to deployment, counsel must be equipped to understand the bias that will likely (unintentionally) creep into algorithmic decision-making and to manage the risk of such bias.
I’ve written extensively on algorithmic bias and the role that employment lawyers will have to play in countering it. A recent paper published in Science demonstrates that bias empirically.
Continue reading “Still More on Algorithmic Bias”
While many in the nonprofit community believe that a privacy and cybersecurity program is beyond their means, the fact is there are many ways to tackle this problem — many of them low or no cost, and most of them low-tech. And the cost of doing nothing is very high. In the highly competitive world of nonprofit reputation management, the consequences of a breach can be devastating.
I enjoyed presenting on this subject to a lively and engaged crowd at the NTEN Nonprofit Technology Conference with my colleague Raf Portnoy.
- Session details here
- Slides here
- Participant’s notes from our presentation are here.
Employers in the US likely cannot pay their employees in virtual currencies (VC) such as Bitcoin or Ethereum. For employers who are still interested (or not fully persuaded by my line of reasoning), I offer some liability-minimizing strategies below.
(Post updated 4/3/2017)
Continue reading “Paying Employees in Bitcoin?”
The payroll office – which combines the most sensitive employee information with the ability to initiate money transfers – is where the “rubber hits the road” for both cybersecurity and its close cousin, privacy. Managing security and privacy risk – and interfacing with information security experts – is (and should be) increasingly part of the payroll professional’s job duties. In short: payroll professionals should be a part of the cybersecurity planning process.
Here is the deck from my recent presentation at the annual meeting of the NY Metro Area chapter of the American Payroll Association.