Every year Conacyt opens a new call for the National System of Researchers (SNI). Last year there were 15,520 applicants, and in 2022 the number will surely be similar or higher. In 2021, 7,767 applied for admission and 47% were accepted. Of the 9,038 who needed to renew that year, as Conacyt indicated in statement 276/2022, only 7,753 did so, and 75% of them stayed. In other words, 39% of those who applied in that call were left out, and 1,285 did not even request a renewal. Those who are or have been in the SNI know what it means to update the Unique Curriculum Vitae (CVU) when a call for entry, permanence, or promotion to the SNI opens, and they can recall the time spent completing and submitting their application “to the system”.
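The figures above can be reconciled with a few lines of arithmetic; a minimal sketch, using only the numbers reported in the article and rounding the way the published percentages suggest:

```python
# Reconciling the 2021 SNI call figures cited above.
# All inputs come from the article; the percentages (47%, 75%, 39%)
# are as reported, so intermediate counts are rounded estimates.

new_applicants = 7_767                          # applied for admission
new_accepted = round(0.47 * new_applicants)     # 47% accepted -> ~3,650

renewal_pool = 9_038                            # needed to renew that year
renewals_filed = 7_753                          # actually requested renewal
renewals_kept = round(0.75 * renewals_filed)    # 75% stayed -> ~5,815

total_applicants = new_applicants + renewals_filed        # 15,520
left_out = total_applicants - (new_accepted + renewals_kept)
no_renewal = renewal_pool - renewals_filed                # 1,285

print(total_applicants)                 # 15520
print(left_out / total_applicants)      # ~0.39, i.e. "39% were left out"
print(no_renewal)                       # 1285
```

The three reported figures (15,520 total applicants, 39% left out, 1,285 non-renewals) all fall out consistently from the stated inputs.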
Some will remember the times when you had to carry a complete paper copy of your Curriculum Vitae. The SNI started in 1984, at the dawn of the digital era, but for two decades it was an exclusively paper-based institution. The long lines outside the SNI facilities to submit applications filled out on a typewriter, along with photocopies of supporting receipts, were a classic image. Attendance was in strict alphabetical order, and the number of boxes likely correlated with the applicant's level. Some time later, the interested party received a message to pick up the reviewed material. Although the Conacyt of the last century did not keep data electronically, those in charge of the SNI organized the files so that the committees could review them and issue their opinions.
In October 2005, Conacyt established the policy that all researchers wishing to carry out any procedure or request with Conacyt programs had to fill out the new electronic CVU format on the portal www.conacyt.mx, using a username and password; in exchange, the system granted an electronic identifier that distinguishes each person for the rest of their life, thereby avoiding duplication or loss. From that date on, the SNI calls inaugurated a period of hybrid applications: applicants had to fill out the CVU electronically and also send a package, one per applicant, addressed to the SNI by specialized courier. As of 2015, paper submissions disappeared, and the supporting documents for the academic production listed in the CV had to be sent to Conacyt on a compact disc (CD). Fortunately, the digital era ended up dominating scientific paperwork, and currently all registration and documentation for the calls is done electronically.
Two issues have always intrigued me: a) What is the use of having an electronic database spanning at least 17 years of CVUs from everyone who submitted their data, if applicants have to fill out the forms all over again? And b) how many hours of collective work are accumulated in that database? It is not clear whether the technology serves the community or the community serves the available technology. The latter is counterintuitive.
From 2021 to 2022, the number of national researchers increased to 36,714. Drawing on different official reports, graph 1 shows that 1,386 national researchers were registered in 1984, the first year, and that growth through 2022 has been exponential. An annual change of 8.6% is recorded over the nearly four-decade period, with variations from 14.4% in the first decade to 6.8% in the last. But beyond the numbers or the dynamics of growth, I want to refer to the personal data accumulated throughout this period. I can only imagine the size of the data repository that Conacyt has, or should have, given the accumulated years and the number of applicants.
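The overall growth rate can be recomputed as a compound annual rate from the two endpoint counts; a minimal sketch, noting that the exact figure depends on which endpoints and reports are used, so it may differ slightly from the 8.6% cited above:

```python
# Compound annual growth rate (CAGR) of SNI membership, 1984-2022.
# Endpoint counts come from the article; the result is sensitive to
# the choice of endpoints and elapsed years, which may explain small
# differences from the 8.6% average change cited in the text.

first_year, last_year = 1984, 2022
first_count, last_count = 1_386, 36_714

years = last_year - first_year                       # 38 elapsed years
cagr = (last_count / first_count) ** (1 / years) - 1

print(f"{cagr:.1%}")   # roughly 9% with these two endpoints
```

A ~26-fold increase over 38 years is exponential growth in the plain sense: a constant annual rate compounded, which is what graph 1 depicts.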
What is done with this CVU database beyond using it to decide who receives the incentives of entering the SNI? Why is access to it not public? According to Julia Lane, the Brazilian experience with the Lattes database (http://lattes.cnpq.br/english) is a powerful example of good practices for improving the quality of science. In Brazil, at the end of the 1990s, there were 1.6 million researchers and around 4,000 institutions. Back then, Brazil’s national research funding agency recognized that it needed a new approach to assessing researchers’ credentials. First, it developed a “virtual community” of federal agencies and researchers to design and build the Lattes electronic infrastructure. Second, it created appropriate incentives for researchers and academic institutions to use the database: the federal agency refers to the data when making funding decisions, and universities do so when deciding tenure and promotion. Third, it established a unique researcher identification system to ensure that people with similar names are properly credited. The result is one of the cleanest researcher databases in existence.
In Mexico, these types of practices should be studied and discussed. Systematizing access to Conacyt’s databases should be a national priority. The history, publications, funding, teaching, thesis supervision, and conference attendance of thousands upon thousands of researchers from the nine areas that make up the SNI are contained there; above all, so are the research topics that give meaning to the science practiced in the country, regardless of discipline, as well as the topics that are missing. Science policy not only needs regulations; it also requires an in-depth and detailed analysis of what is done, who does it, and in which institutions.
To answer this, the “General Reports on the State of Science, Technology and Innovation” published year after year are not enough, nor are the databases of “current researchers”, published since 2015 with a small and inconsistent set of variables. We have to be more creative, as Julia Lane says: “…in addition to building an open and consistent data infrastructure, there is the additional challenge of deciding what data to collect and how to use it. This is not trivial. Knowledge creation is a complex process, so perhaps alternative measures of creativity and productivity should be included in scientific metrics, such as filing patents, building prototypes, and even producing YouTube videos. Many of these are more up-to-date measures of activity than counting publications and citations…” As is well known, the transmission of knowledge differs from one field to another, and we cannot keep applying traditional bibliometric approaches and generalizing them to all disciplines.
It has been more than half a century since the first citation-indexing attempts and almost four decades since the SNI was founded; we need to shake off that old-fashioned evaluation mantra. It would be great to create more reliable, more transparent, and more flexible metrics of scientific performance, involving the stakeholders. It would be good to analyze the effect of changes in incentives, in both funding and evaluation, and for that we need economic theory. With a large amount of data on scientific interactions available thanks to the Internet, and with more and more people committed to the scientific measurement of science, we need to act with a vision of the future. Beyond identifying and pigeonholing national researchers, we need to capture the essence of what it means to be a good scientist in Mexico and promote good practices. That, presumably, is why we gather so much data year after year.
*The author is a professor in the Department of Health Metrics Sciences at the University of Washington and at the Institute for Health Metrics and Evaluation, and a member of the SNI, level III. https://www.healthdata.org/about/rafael-lozano