How to make custom OCR recognize decimals
I've got a working handwriting-recognition program. It covers the basics and is trained on the EMNIST dataset. I ran into a problem when I tried reading a line containing numbers with decimals: I don't know how to make the program recognize decimal points without treating every stray dot or speck as one.
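For context, here's a sketch of the kind of size/position heuristic I've been considering. It assumes each glyph on the line comes out of connected-component analysis as a bounding box (x, y, w, h), with y increasing downward; the function name and thresholds are my own placeholders, not from any library:

```python
def is_decimal_point(dot, digits, size_ratio=0.35, baseline_tol=0.25):
    """Guess whether `dot` is a decimal point rather than a stray speck.

    dot    : (x, y, w, h) bounding box of the candidate blob
    digits : list of (x, y, w, h) boxes for the digits on the same line
    """
    if not digits:
        return False
    # Median digit height gives a scale for "small".
    med_h = sorted(h for (_, _, _, h) in digits)[len(digits) // 2]
    x, y, w, h = dot
    # A decimal point should be much smaller than a digit...
    if max(w, h) > size_ratio * med_h:
        return False
    # ...and its bottom should sit near the digits' baseline
    # (median of the digits' bottom edges).
    baseline = sorted(dy + dh for (_, dy, _, dh) in digits)[len(digits) // 2]
    return abs((y + h) - baseline) <= baseline_tol * med_h
```

So a small blob resting on the baseline between two digits would pass, while a speck floating above the line would fail. I'm not sure this is robust enough for messy handwriting, which is why I'm asking.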