About the Reviewers
Matthew Brooks spent 11 years in the Navy, during which he became a SEAL. He left the Navy in 2003 and began working in the IT industry. He worked with the Center for SEAL and SWCC as the Operations Research Assistant, where he provided analytical support for many different career-related issues for active duty SEALs and Special Warfare Combatant Crewmen (SWCC). He worked on problems ranging from simulating the effects of policy on manpower distribution, to the assessment and selection of SEAL candidates, to analyzing contributors to pressure on the force, to enlisted advancement, for which he developed and maintained the Alternative Final Multiple Score (AFMS) for SEALs and SWCCs.
Fabrice Leroy is a Principal Consultant in the IBM Business Analytics Software Group. He has over 15 years of international experience applying advanced analytics to help organizations solve their business problems.
He is a specialist in designing and implementing large-scale data mining applications, and is also recognized as a world-leading expert in IBM SPSS Modeler.
Robert Nisbet has a Ph.D. and is a consulting data scientist to IBM and Aviana Global, where he focuses on CRM modeling solution development. He recently built a churn model for a major bank in Texas using the IBM SPSS Modeler package. He was trained initially in Ecology and Ecosystems Analysis. He has over 30 years of experience in the analysis and modeling of complex ecosystems, both in academia (UC Santa Barbara) and in many large companies. He led the team at NCR Corporation in the development of the first commercial data mining solutions for telecommunications (churn and propensity to buy). Currently, he is an instructor in the UC Irvine Predictive Analytics Certification Program.
He is the lead author of Handbook of Statistical Analysis and Data Mining Applications (Academic Press, 2009), and a co-author of Practical Text Mining (Academic Press, 2012). He serves as a general editor for a new book, Predictive Analytics in Medicine and Healthcare, under contract with Academic Press (Elsevier Publ.) for publication in 2014. His next book, co-authored with Keith McCormick, will cover the subject of effective data preparation.
David Young Oh is a practicing clinical mental health counselor with a continued interest in psychological research and statistics. His previous research on moral engagement and international perspectives on peace and war has resulted in several books and journal publications. Most recently, he has worked on International Handbook of War, Torture and Terrorism and State Violence and the Right to Peace: An International Survey of the Views of Ordinary People. He completed his clinical internship and MS in Mental Health Counseling at Johns Hopkins University, and his BA and MA at Boston University. He currently lives and practices in Raleigh-Durham, North Carolina, with his partner, dog, and chickens.
Jesus Salcedo is QueBIT's Director of Advanced Analytics Training. Previously, he worked for IBM SPSS as the SPSS Curriculum Team Lead and as a Senior Education Specialist. Jesus was a college professor and worked at Montefiore Medical Center in the department of psychology. He has been using SPSS products for two decades. He has written numerous SPSS training courses and has trained thousands of users in both SPSS Statistics and SPSS Modeler. He received a Ph.D. in Psychometrics from Fordham University.
Terry Taerum is an analyst who has been fortunate to be in Intel's data mining business for more than 25 years. In his view, the focus has to be on growing a profitable and sustainable network of information and idea exchange. Doing so requires good data, great analytical tools, a deep understanding of the subject matter, and a long-term commitment to continuous improvement. No one can do all of this alone; it requires teamwork and a partnership among all vested parties.
His college years at the University of Calgary, where he earned a doctorate, were spent primarily working on a timeshare PDP/8 and earning money as a statistical consultant. In spite of the changing speed of technology, the problems remain much the same, only on a much grander scale. The challenge continues to be finding better ways to maximize profit, whether measured in dollars, bushels of wheat, or happiness. The solutions are, however, much more interesting these days: pulling in resources from all around the world, using recording and digitizing processes rarely imagined in the past, and creating new and exciting means to increase all kinds of return on investment.
More recently, he has been part of larger teams prescribing actions intended to increase sales, identifying people most likely to be involved in fraud or the transport of illegal property, providing post hoc analysis of merchandising efforts, and modeling early detection of faults in the manufacturing of electronic goods and other processes. He was one of the first users of IBM SPSS Modeler in North America (13 years ago, when it was still called Clementine), at a time when it was best known for its neural networks. In all of these endeavors, the focus has been on growing the network of information in order to make business processes sustainable and more profitable.