3) 60 years later, the opportunity is to create new intellect-augmenting tools that harness the power of 3 new technologies: a) public cloud, b) new sensors/big data/ML, and c) numerical (exascale) simulation as a powerful complement to experiment. What might these new tools be?
4) For example: vast datasets configured for interactive and collaborative analysis; a global knowledge graph linking all pubs, data, and models; continuously updated digital twins of complex systems like the human body, climate, economy, and manufacturing plants; and much more
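To make the knowledge-graph idea concrete, here is a minimal sketch in Python using networkx. The node identifiers and relation names are illustrative assumptions, not any real schema:

```python
import networkx as nx

# Illustrative only: a tiny slice of a global research knowledge graph.
# Node kinds and relation names are hypothetical, not an established schema.
kg = nx.MultiDiGraph()

# Entities: a publication, the dataset it analyzes, the model it introduces.
kg.add_node("doi:10.0000/example-paper", kind="publication")
kg.add_node("dataset:metagenome-survey-v2", kind="dataset")
kg.add_node("model:gene-annotation-cnn", kind="model")

# Typed links capture how the artifacts relate.
kg.add_edge("doi:10.0000/example-paper", "dataset:metagenome-survey-v2",
            relation="analyzes")
kg.add_edge("model:gene-annotation-cnn", "dataset:metagenome-survey-v2",
            relation="trained_on")
kg.add_edge("doi:10.0000/example-paper", "model:gene-annotation-cnn",
            relation="introduces")

# A simple query: everything that points at the dataset.
for src, dst, attrs in kg.in_edges("dataset:metagenome-survey-v2", data=True):
    print(f"{src} --{attrs['relation']}--> {dst}")
```

Even this toy version shows the payoff: once pubs, data, and models are linked, "find every model trained on this dataset" becomes a graph query rather than a literature search.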
5) We can create science services to democratize access to powerful analysis capabilities. E.g., MG-RAST allows any scientist to upload a metagenomic dataset and obtain sophisticated analysis results--something otherwise out of reach for many labs: mg-rast.org
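From the scientist's side, using such a service is roughly an upload plus a results poll. The sketch below is hypothetical Python against an invented endpoint (analysis.example.org) and invented payload fields--it is not the actual MG-RAST API:

```python
import time
import requests

# Hypothetical science-service client; the endpoint, fields, and job-status
# protocol are invented for illustration (not the real MG-RAST API).
BASE = "https://analysis.example.org/api/v1"

# 1. Upload the raw metagenomic reads.
with open("sample_reads.fastq", "rb") as f:
    resp = requests.post(f"{BASE}/jobs", files={"reads": f},
                         data={"pipeline": "metagenome-annotation"})
resp.raise_for_status()
job_id = resp.json()["job_id"]

# 2. Poll until the server-side pipeline finishes.
while True:
    status = requests.get(f"{BASE}/jobs/{job_id}").json()
    if status["state"] in ("completed", "failed"):
        break
    time.sleep(30)

# 3. Fetch the analysis results.
if status["state"] == "completed":
    results = requests.get(f"{BASE}/jobs/{job_id}/results").json()
    print(results["summary"])
```

The point of the pattern: all the expertise, compute, and reference data live behind the service, so the scientist needs only a dataset and an HTTP client.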
6) We can also leverage public cloud to create services that power data- and compute-rich distributed science. @globus is an example: hosted on @awscloud, it links storage and data at 10,000s of locations. What other services can we imagine?
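For a sense of what this looks like programmatically, here is a minimal sketch using the globus-sdk Python package. The endpoint UUIDs and paths are placeholders, and the Globus Auth (OAuth2) login flow that produces the token is elided:

```python
import globus_sdk

# Placeholders: a real script first completes a Globus Auth (OAuth2) flow
# to obtain a transfer token, and uses real endpoint UUIDs.
TRANSFER_TOKEN = "<access-token-from-globus-auth>"
SRC = "<source-endpoint-uuid>"
DST = "<destination-endpoint-uuid>"

# Client for the Globus Transfer service, which runs on public cloud.
tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer(TRANSFER_TOKEN))

# Describe a transfer between two registered storage endpoints.
tdata = globus_sdk.TransferData(tc, SRC, DST,
                                label="example transfer",
                                sync_level="checksum")
tdata.add_item("/data/run42/results.h5", "/archive/run42/results.h5")

# The hosted service manages the movement: retries, integrity checks,
# and notification happen server-side, not on the user's machine.
task = tc.submit_transfer(tdata)
print("submitted task:", task["task_id"])
```

Note the design: the user's script only describes the transfer; the cloud-hosted service does the long-running, failure-prone work.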
7) We point out in the white paper that realizing a fully integrated "National Discovery Cloud" will require development in many areas and resolution of hard issues: stability vs. innovation, incentives for creating and operating new services, and much more.
8) I noted in summary: when using cloud for science, look beyond "public cloud as elastic computer"--that sells it short. It's a powerful platform for delivering new digital services. Our task is to work out what those services should be.
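As a toy illustration of "platform for delivering services" rather than "elastic computer": a few lines of Python (using Flask, my choice; all names here are invented) suffice to wrap an analysis capability as a web service any scientist can call:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for some domain analysis; a real science service might dispatch
# this to a pipeline running on elastic cloud resources.
def analyze(records):
    return {"n_records": len(records),
            "mean_length": sum(len(r) for r in records) / max(len(records), 1)}

@app.route("/analyze", methods=["POST"])
def analyze_endpoint():
    # Expect a JSON body like {"records": ["ACGT...", ...]}.
    records = request.get_json(force=True).get("records", [])
    return jsonify(analyze(records))

if __name__ == "__main__":
    app.run(port=8080)
```

The hard part is not the wrapper--it's deciding which capabilities, delivered this way, would most change how science gets done.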
9) These are questions that I have been pondering for a while, at least since this 2005 article on "Service-Oriented Science"--but substitute "cloud" for "grid" :-) people.cs.clemson.edu/~johnmc/courses/cpsc950/814.pdf