Nagwa, an EdTech startup in the field of mathematics education based in Windsor, UK, has announced a new vacancy for a native-Spanish-speaking candidate with a PhD in mathematics.
The job title is “Mathematics Content Writer (Spanish)” and the salary bracket is £30,000–36,000. Full details about the job and a link for interested applicants to apply are available at this link:
Software engineer Jean E. Sammet, who co-designed the Common Business Oriented Language (COBOL) and was elected the first female president of the ACM in 1974, passed away on May 20 at the age of 89. Sammet achieved a level of prominence in computing beyond most women of her generation, and she once said her ambition was “to put every person in communication with the computer,” according to University of Maryland professor Ben Shneiderman. The Computer History Museum’s Dag Spicer says Sammet’s book, “Programming Languages: History and Fundamentals,” published in 1969, “was, and remains, a classic” in the field. COBOL remains an essential element in the mainframes underlying corporate and government agency operations worldwide. Sammet worked with five other programmers designing COBOL over a period of two weeks, and the language enabled innovative techniques for describing and representing data in computer code. Sammet later worked to inject more engineering discipline into the language.
More info here: The New York Times, by Steve Lohr
Researchers at Salesforce have developed an algorithm that applies machine-learning techniques to condense lengthy textual documents accurately and coherently, a technology that could impact fields such as law, medicine, and scientific research. The algorithm blends several strategies: it is trained with supervised learning on example summaries, and it applies an artificial attention mechanism both to the text it is reading and to the text it is generating. That mechanism ensures the system does not return too many repetitive strands of text, which has been a problem for other summarization programs. In addition, the system experiments with producing its own summaries via reinforcement learning. Northwestern University professor Kristian Hammond lauds the Salesforce algorithm, but says it also illustrates the limits of relying solely on statistical machine learning. “We need a little bit of semantics and a little bit of syntactic knowledge in these systems in order for them to be fluid and fluent,” Hammond says.
More info here: Technology Review
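The anti-repetition idea can be illustrated with a toy "temporal attention" mechanism: at each decoding step, a source position's attention score is divided by the attention it has already received at earlier steps, steering the summarizer away from copying the same passage twice. This is a minimal sketch of the general technique, not Salesforce's actual model; the function name and toy scores are illustrative assumptions.

```python
import math

def temporal_attention(raw_scores_per_step):
    """Toy temporal attention: penalize source positions that already
    received high attention at earlier decoding steps (sketch only)."""
    n = len(raw_scores_per_step[0])
    past = [0.0] * n  # accumulated exponentiated scores per source position
    weights_per_step = []
    for scores in raw_scores_per_step:
        exp_scores = [math.exp(s) for s in scores]
        # Divide by attention already "spent" on each position.
        penalized = [e / p if p > 0 else e for e, p in zip(exp_scores, past)]
        total = sum(penalized)
        weights_per_step.append([p / total for p in penalized])
        past = [p + e for p, e in zip(past, exp_scores)]
    return weights_per_step

# Two decoding steps with identical raw scores: without the penalty the
# decoder would focus on position 0 twice; with it, attention spreads out.
steps = temporal_attention([[2.0, 1.0, 0.5], [2.0, 1.0, 0.5]])
```

At the first step, position 0 dominates; at the second, the penalty flattens the distribution, so the decoder is less likely to regenerate the same text.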
Researchers at the Korea Advanced Institute of Science and Technology (KAIST) have developed a tactile sensor composed of silicon and carbon materials that can serve as a skin for robots, absorb shocks, and differentiate between various forms of touch. The researchers combined silicon and carbon nanotubes to produce a composite, which was paired with a medical-imaging technique called electrical impedance tomography. The team says the new material can determine both the location and the size of objects touching it. In addition, it can withstand strong force and can function as a three-dimensional computer interface and tactile sensor. The researchers also note the sensor can be repaired after partial damage by filling the damaged region with the composite and hardening it. “This technology will contribute to the soft robot industry in the areas of robot skin and the field of wearable medical appliances,” says KAIST professor Jung Kim.
More info here: EE Times Asia
Computers have for the first time trained themselves to cooperate in games in which the goal is to achieve the best possible outcome for all players. Brigham Young University professor Jacob Crandall and colleagues brought humans and computers together to play digital versions of chicken, prisoner’s dilemma, and a third collaborative game called “alternator.” Teams consisted of two people, two computers, or one human and one computer. Twenty-five different machine-learning algorithms were tested, but none was capable of learning to cooperate on its own. The researchers then gave the computers a communication channel, adding 19 prewritten phrases that partners could send back and forth after each turn. Over time, the computers had to learn the phrases’ meaning in the context of the game. The S# algorithm learned to cooperate with its partner within a few turns, and by the end of the game the machine-only teams cooperated at a higher rate than the human teams.
More info here: Science – Jackie Snow
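The repeated-game setting the study uses can be sketched with a toy iterated prisoner's dilemma. The payoff values and strategies below are standard textbook choices, not the study's actual parameters, and the simple reciprocating strategy shown is only an illustration of how cooperation can be sustained over repeated turns; it is not the S# algorithm.

```python
# Classic prisoner's dilemma payoffs: (my move, partner's move) -> my payoff.
# C = cooperate, D = defect. Values are the standard textbook ones.
PAYOFFS = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(history):
    """Cooperate on the first turn, then copy the partner's last move."""
    return "C" if not history else history[-1][1]

def always_defect(history):
    return "D"

def play(agent_a, agent_b, rounds):
    hist_a, hist_b = [], []  # each entry: (own move, partner's move)
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = agent_a(hist_a), agent_b(hist_b)
        score_a += PAYOFFS[(a, b)]
        score_b += PAYOFFS[(b, a)]
        hist_a.append((a, b))
        hist_b.append((b, a))
    return score_a, score_b

mutual = play(tit_for_tat, tit_for_tat, 10)    # sustained cooperation
exploit = play(tit_for_tat, always_defect, 10)  # cooperation collapses
```

Two reciprocating agents earn (30, 30) over ten rounds, while pairing one with a pure defector yields a lower joint total, which is why learning when to cooperate (and how to signal it) pays off in these games.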
Researchers at the University of Bayreuth in Germany have developed hexagonal boron-carbon-nitrogen, a two-dimensional (2D) material that could revolutionize electronics. The researchers say the new material, which features semiconductor properties, could be better suited for high-tech applications than other conventional materials, such as graphene. “Our development can be the starting point for a new generation of electronic transistors, circuits, and sensors that are many times smaller and more flexible than previous electronic elements,” says Bayreuth professor Axel Enders. Although graphene is extremely stable and serves as an excellent conductor of heat and electricity, its electrons flow unhindered at any electrical voltage, meaning there are no defined “on” and “off” states. The Bayreuth researchers addressed this problem by replacing individual carbon atoms in the material with boron and nitrogen, in such a way that the atoms formed a two-dimensional lattice with semiconductor properties.
More info here: University of Bayreuth, Christian Wissler
An exascale supercomputer will likely be realized within the Trump administration’s first term, which could be a tipping point for U.S. supercomputing. Supercomputing is viewed as essential to national competitiveness because of the increasingly virtual nature of research and product development. Europe has set an exascale delivery schedule of 2022 along with a $749-million commitment, while both China and Japan aim to have a system ready by 2020. China is using its own microchips, while a European system in development uses ARM processors. The Obama administration initially set a 2023–2024 target date for exascale, but amended it in its final weeks to 2021, with a projected budget of $3.1 billion to $5.7 billion. Argonne National Laboratory’s Paul Messina says the U.S. Department of Energy’s Exascale Computing Project “is now a seven-year project, not a 10-year project, but it will cost more.” China currently has the world’s fastest supercomputer, running at about 125 petaflops. Although the U.S. exascale project’s goals include contributing to the country’s economic competitiveness and supporting national security, another objective is developing a software stack, in collaboration with vendors, that smaller systems in industry and academia can utilize.
More info here: Computerworld (01/20/17) Patrick Thibodeau
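For a sense of scale, an exaflop is 10^18 floating-point operations per second, or 1,000 petaflops. A quick sanity check against the ~125-petaflop figure cited above:

```python
PETA = 10 ** 15
EXA = 10 ** 18

current_top = 125 * PETA           # ~125 petaflops, today's fastest system
speedup_needed = EXA / current_top  # factor required to reach exascale
```

So the cited machine would need roughly an eightfold speedup to reach the exascale mark the competing programs are targeting.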