I would normally reach for SQL LIKE statements or regex work, but my problem is how to compare a single value against every value in an entire database field.
For example, I have 150 address records in my billing dataset that I need to compare against a master database with over 100,000 address entries. The problem is that the data is not consistent in either source, and normalizing it is not an option. My billing data may have an address of “150 Main St” while the master database has a record of “Parkview Apartments 150 Main St”. I want to take each of the 150 input address values and compare it to all 100,000 entries in the master database to generate a list of matched records where the master database address CONTAINS the value of my input billing address.
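To show the kind of match I mean, here is a minimal SQL sketch, assuming a billing table (~150 rows) and a master table (~100,000 rows) that each have an address column; the table and column names are placeholders for my real schema, and the '||' concatenation is ANSI syntax that some dialects replace with CONCAT() or '+'.

```sql
-- Return every (billing, master) pair where the master address
-- contains the billing address as a substring.
SELECT b.address AS billing_address,
       m.address AS master_address
FROM   billing b
JOIN   master  m
  ON   m.address LIKE '%' || b.address || '%';
```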
The only way I could figure out to compare every one of my input records against each of the 100,000 records in the master database was to build a massive matrix pairing each of my billing address records with each record in the master database, and then brute-force through the result with a conditional TestFilter.
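In SQL terms, that brute-force approach looks something like the sketch below (same placeholder schema as above): build the full 150 × 100,000 pairing with a cross join, then filter the pairs where the master address contains the billing address. It is logically equivalent to the join above, just with the "matrix" step made explicit.

```sql
-- Materialize every billing/master pair, then keep only the
-- pairs where the master address contains the billing address.
SELECT b.address AS billing_address,
       m.address AS master_address
FROM   billing b
CROSS JOIN master m
WHERE  m.address LIKE '%' || b.address || '%';
```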