Great to hear from you again! We use "Coefficient of Concordance" (CoC) as a measure of model performance: it expresses how well a model discriminates between "positives" and "negatives". Many other performance measures exist (e.g. accuracy, the KS statistic), but we use CoC, which is simply another name for what is commonly called "AUC" (Area Under the Curve). In future releases we will use the name AUC instead of CoC.
Whatever the name, this is a performance measure that handles imbalanced binary outcomes well (e.g. many more "offered" than "accepted"), which is common in our customers' use cases.
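To make the idea concrete: concordance/AUC can be read as the probability that a randomly chosen positive gets a higher model score than a randomly chosen negative, with ties counted as half. Here is a minimal pure-Python sketch of that pairwise definition; the labels and scores are made-up illustration data, not anything from our product.

```python
def concordance(labels, scores):
    """Fraction of (positive, negative) pairs where the positive
    scores higher; ties count as half. Equivalent to AUC."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    if not pos or not neg:
        raise ValueError("need at least one positive and one negative")
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos for n in neg
    )
    return wins / (len(pos) * len(neg))

# Hypothetical example data: 1 = accepted, 0 = offered but not accepted
labels = [1, 0, 1, 0, 0, 1]
scores = [0.9, 0.2, 0.7, 0.4, 0.7, 0.6]
print(concordance(labels, scores))  # prints 0.8333...
```

Because it only depends on the ranking of positives against negatives, this number is unaffected by how rare the positive class is, which is why it behaves well on imbalanced outcomes.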