How to optimize R&D resources
“The challenge for all researchers is to optimize limited resources,” said Dr. Malcolm Moore, JMP European Technical Manager at SAS, during the masterclass “How to Innovate Faster with Data Analytics” on the JMP system (pronounced ‘jump’).
One of the main challenges for researchers is to reduce the number of experiments while still gaining the information required to optimize chemical processes. To achieve this, statistical methods were developed during the twentieth century by statisticians such as Ronald Fisher (Factorial Design), Frank Yates (Fractional Factorial Design), George Box (Response Surface Methodology), and Bradley Jones (Optimal (Custom) and Definitive Screening Designs). These methods were initially developed to increase agricultural yields and were later applied to other fields, including chemistry.
JMP is a supporting tool for Design of Experiments (DOE). The system can be fed with available data from single-factor runs, e.g. yield as a function of temperature. Data mining techniques then narrow the set of potential factors down to the relevant ones, after which JMP defines the runs needed to establish the most decisive factor(s). The number of runs is typically twice the number of relevant factors plus one, which is (far) fewer than researchers would usually perform without such a tool; the arithmetic is sketched below.
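To make the run-budget rule of thumb concrete, the short sketch below is purely illustrative (it is not JMP output, and the factor names are made up): it enumerates a small two-level factorial design and compares its size with the “two times the number of relevant factors plus one” heuristic quoted above.

```python
# A minimal sketch (not JMP output) of what "defining the runs" means:
# enumerate a small two-level factorial design for hypothetical factors
# and compare its size with the 2k + 1 rule of thumb quoted above.
from itertools import product

factors = ["temperature", "pressure", "catalyst_loading"]  # hypothetical factor names
k = len(factors)

full_factorial = list(product([-1, +1], repeat=k))  # every low/high combination
print(f"Full two-level factorial: {len(full_factorial)} runs")  # 2**3 = 8
print(f"Rule-of-thumb budget (2k + 1): {2 * k + 1} runs")       # 7

for run in full_factorial:
    print(dict(zip(factors, run)))  # one run = one setting per factor
```

The saving grows quickly with the number of factors: a full two-level factorial needs 2^k runs, while the rule of thumb keeps the count linear in k.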
Moore explained a concrete case with 35 potential factors. Data mining narrowed this number down to ten. Based on that outcome, the JMP system defined (2 × 10 + 1 =) 21 runs, enough to establish that in this case optimizing the temperature was crucial for maximizing the yield of the chemical reaction.
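The factor-screening step in this case can also be illustrated with a hedged sketch. The snippet below is an assumption-laden stand-in, not JMP's actual data-mining algorithm: it simulates historical run data for 35 candidate factors, ranks them with a tree-based model's feature importances, and keeps the top ten, which then sets the 2 × 10 + 1 = 21-run budget.

```python
# Assumed illustration of the screening step (not JMP's algorithm):
# rank 35 simulated candidate factors by feature importance and keep ten.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_runs, n_factors = 200, 35                  # historical runs, candidate factors
X = rng.uniform(size=(n_runs, n_factors))

# For the sketch, yield truly depends on only a few factors,
# with the first one (think: temperature) dominating.
y = 3.0 * X[:, 0] + 1.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.2, size=n_runs)

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
ranking = np.argsort(model.feature_importances_)[::-1]   # most important first

top_k = 10                                   # shortlist carried into the designed experiment
print("Factors to keep:", ranking[:top_k])
print("Suggested run budget (2k + 1):", 2 * top_k + 1)   # -> 21 runs
```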
To conclude the case: the startup concerned, Novomer, sold the process to Saudi Aramco in 2016 for $100 million.
The masterclass was hosted by SABIC (Teena Bonizzi) and was open to the whole campus community.