Jessica Leong, CEO of insurance-focused data analytics firm Octagram, recently addressed some of the major concerns about bias in the insurance space during an industry panel discussion.
Leong specifically noted that much of this bias is inadvertent: it may stem from biased training data, or from accidental bias built into the structure of actuarial models themselves. You can see an example of this kind of inherent bias simply by googling 'nurse', which will likely return a disproportionate number of images of women, and then googling 'computer programmer', which will probably return a disproportionate number of images of men.
There are also several complex issues that society as a whole must grapple with to determine if and when certain biases should be built into the models. For example, if the data indicate that women are safer drivers who incur less damage from driving and accidents, should women receive a discounted rate as a result? If certain neighborhoods experience less property damage, should homeowners insurance premiums there come down as a result?