Dimensions is a linked data platform launched in January 2018 by Digital Science. It brings data together to create a linked research discovery tool, drawing on the Digital Science suite of products (Altmetric, Consultancy, Figshare, ReadCube, Symplectic (Minerva Elements) and UberResearch) as well as a large number of external partners and publishers.
Dimensions includes citation data, research analytics features and tools for filtering scholarly content such as publications, patents, clinical trials aggregated from various registries, and grant information.
Digital Science created an integrated database covering the research process from funding to research, publishing to attention, as well as commercial application and policy making.
This platform for research evaluation is still evolving. This guide will help you start navigating and using Dimensions.
The university subscribes to Dimensions Plus, an enhanced version of Dimensions which offers more than the standard, free version available to the public. Individual registration is required to use all the features of this database. Enter your unimelb.edu.au email address at the login page and follow the prompts to create an account. A verification email should arrive within minutes. If it does not, it may be on hold in your Mimecast Personal Portal. More information on how to check and release messages from your portal is available here.
Analytical Level access is available to staff on application. More information is available on the staff hub page.
Dimensions extracts references between publications either from existing databases (such as Crossref, PubMed Central, or OpenCitations data), or directly from the full-text records provided by the content publisher. Reference extraction is not limited to journal items; it also includes citations from and to books, conference proceedings, and pre-prints.
Beyond publications, Dimensions extracts references and links to grants, clinical trials and patents, giving users a fuller understanding of the context of a piece of research by breaking down the walls between isolated data silos.
Dimensions includes data from a huge number of sources. The data is converted to a common data model, cleaned, and then enriched so it is ready for use. The enrichment steps include disambiguation of people (“Researchers”) and Organizations, and categorizing the data into topics (“Categories”).
As with many databases dealing with research information, the data is often biased towards STEM subjects, with less information for the arts and humanities. This is partly due to the relative volume of research activity in these areas, and partly due to the availability of data across different subjects. Coverage of arts, humanities and social sciences topics is being increased.
The publication and citation content in Dimensions is aggregated in a two-step process.
The robust metadata 'backbone' is shaped from sources such as PubMed, PubMed Central, ArXiv and Crossref. Currently this results in more than 90 million records. Crossref records with DOIs (provided by the contributions of Crossref members) form a large core of these.
Through mining of the full-text records and acknowledgement sections of the underlying records, the metadata can be enhanced and links created that would otherwise be impossible to generate.
This makes Dimensions a unique resource, different to any other citation index such as Scopus or Web of Science.
The way the diverse Dimensions source data is aggregated and integrated makes it possible to compute unique research metrics.
The Relative Citation Ratio (RCR) indicates the relative citation performance of an article when compared to other articles in its area of research. The RCR is normalized to 1.0 and calculated for all articles funded by the NIH in the Dimensions catalog. An RCR of more than 1.0 shows that a publication has an above-average citation rate for its group, where the group is defined by the subject-area citation rates of the articles co-cited with it.
Articles that are less than 2 years old, or do not have citations, do not have an RCR.
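The core idea of the RCR can be illustrated with a simplified sketch. The official NIH/iCite algorithm is more involved (it derives the field benchmark from each article's co-citation network); the function and all numbers below are hypothetical illustrations, not the Dimensions implementation.

```python
# Simplified sketch of the RCR idea: an article's citation rate divided
# by the expected citation rate of its field, where the field rate is
# benchmarked so that NIH-funded articles average 1.0.
# All values are made-up example data.

def relative_citation_ratio(citations_per_year: float,
                            field_citation_rate: float) -> float:
    """Ratio of an article's yearly citation rate to its field benchmark."""
    return citations_per_year / field_citation_rate

# A hypothetical article cited 9 times per year, in a field whose
# benchmark (co-citation network) rate is 6 citations per year:
print(relative_citation_ratio(9.0, 6.0))  # 1.5 -> above average for its field
```

A value above 1.0 corresponds to the "above average citation rate for its group" described above.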
The Field Citation Ratio (FCR) indicates the relative citation performance of an article, when compared to similarly-aged articles in its subject area. The FCR is normalized to 1.0 for this selection of articles. An FCR value of more than 1.0 shows that the publication has a higher than average number of citations for its group (defined by its FoR Subject Code, publishing year, and age).
Articles that are less than 2 years old do not have an FCR. An article with zero citations has an FCR of 0.
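The FCR calculation can be sketched in the same simplified way. This is an illustrative sketch under the assumption that the comparison group is the set of articles sharing the article's FoR subject code and publication year; the function name and example data are hypothetical, not the Dimensions implementation.

```python
# Illustrative sketch of the FCR: an article's citation count relative
# to the mean citation count of similarly-aged articles in its subject
# area (same FoR code and publication year). Example data is made up.

def field_citation_ratio(citations: int, peer_citations: list[int]) -> float:
    """Ratio of an article's citations to the mean of its peer group."""
    mean_peer = sum(peer_citations) / len(peer_citations)
    return citations / mean_peer

# Hypothetical peer group with a mean of 8 citations per article:
peers = [4, 10, 6, 8, 12]
print(field_citation_ratio(16, peers))  # 2.0 -> cited twice the group average
print(field_citation_ratio(0, peers))   # 0.0 -> zero citations gives an FCR of 0
```

As the second call shows, an uncited article gets an FCR of 0, consistent with the rule stated above.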
“Comparison of two article-level, field-independent citation metrics: Field-Weighted Citation Impact (FWCI) and Relative Citation Ratio (RCR)”: http://ssrn.com/abstract=3237564