Jon Payne, database engineer, InterSystems
Months have passed since Devoxx 2018, but some of the conversations I had with the developers visiting our stand there still stick in my mind, probably because I found one or two of them so surprising.
One of the main topics for discussion was building cloud native applications. Of course, I had known that a build-your-own approach here was becoming popular, but I hadn’t appreciated the level at which it had permeated the entire development ecosystem.
Many of the developers I spoke to had a couple of things in common: they all came from organisations that owned huge volumes of data, and they all needed to build processes involving both analytical and transactional workloads on that data, using a variety of different technologies.
These analytical workloads were varied in nature. Many developers were finding that the ability to run SQL, for example, was not enough to meet the functional and non-functional requirements of their queries.
Consequently, a number of developers were building their own cloud native data management platforms. What I found interesting was how many different organisations felt the need to do this, given the wide variety of well-known cloud native and on-premise platforms available in both the SQL and NoSQL space. Yet these organisations could find nothing on the market to suit their needs.
I find this remarkable because, while a build-your-own platform may look like the most cost-effective option to begin with, it is likely to prove far less economical in the long term. What these developers are building is very specific to a particular problem, and as far as it addresses that challenge it is likely to be an initial success. However, there is considerable risk inherent in this approach.
All will be well if the developers who built the solution remain with their organisation. However, if they decide to leave – and let's face it, the competition for developers couldn't be stronger – then their employer either has to offer them more money, or face a knowledge vacuum around the platform, possibly having to bring in expensive consultants to the rescue.
The other issue is functionality. Once the organisation wants to do something extra with the platform, it will need to set up a data pipeline and replicate the data in another data store, restructuring it along the way. Before they know it, they have the same data replicated in four or five different structures. Suddenly, what started out as a cost-effective platform built for a particular purpose has become both expensive and complex.
Interestingly, this was one of the reasons several developers told me they are not going cloud native. This ramping up of cost and complexity is not easy to manage. Besides, low-cost cloud storage services such as Amazon S3 are not fast mechanisms: a read might take 20 seconds, and there is no performance guarantee. If a business is data-intensive, it raises the question of whether cloud native is the right route.
This is especially the case if an organisation has specific hosting requirements. For example, a healthcare company may need to be connected to N3 or its successor, HSCN. If so, the costs will rise dramatically, as the data can't be accommodated in large-scale AWS racks but must be kept apart.
Of course, there is a plus side, especially if a developer makes use of all the services offered by the cloud provider. This can significantly reduce the time to build and deploy a solution, and ensures it is highly scalable. However, it also ties the organisation to a particular cloud provider: a solution built in AWS, for example, can't be moved. Then, as transaction volumes increase, data volumes grow and complexity intensifies, the costs can again rise quite dramatically.
Traditionally in the database market we used to talk about ACID compliance. In the cloud world, because of the way the infrastructure works, providers are unable to offer this as we used to know it. Instead, the big providers have redefined the concept so that they can comply. In some cases this may not matter, but it can bring a whole host of issues to the fore when building applications that depend critically on the consistency and quality of data, such as real-time clinical applications.
Yet despite all these drawbacks, developers are still building cloud native applications because they can't find what they want on the market. This bodes well for solutions such as InterSystems' IRIS Data Platform, whose flexible architecture can meet varied transactional and analytical workloads and interrogate data in a number of different modes – not just SQL, but object- and document-based models too.
What could also make IRIS so valuable in these cases is its interoperability; in particular, its ability to integrate data and applications into seamless, real-time business processes. It can also cut through the complexity, collapsing the tech stack and managing multiple open source components, all in a highly cost-effective manner.
Perhaps I shouldn’t have been so surprised at the number of developers at Devoxx building their own cloud native applications. After all, they are rather a self-selecting band, given the nature of the event.
However, the most exciting aspect here is not that such a technically inquisitive and inventive group are doing it themselves, but that they are being forced to do so, despite the shortcomings of cloud native, by a gap in the market. All of which means current market developments are certainly moving in the right direction.
‘Spooky’ AI tool brings dead relatives’ photos to life
By Umberto Bacchi
(Thomson Reuters Foundation) – Like the animated paintings that adorn the walls of Harry Potter’s school, a new online tool promises to bring portraits of dead relatives to life, stirring debate about the use of technology to impersonate people.
Genealogy company MyHeritage launched its “Deep Nostalgia” feature earlier this week, allowing users to turn stills into short videos showing the person in the photograph smiling, winking and nodding.
“Seeing our beloved ancestors’ faces come to life … lets us imagine how they might have been in reality, and provides a profound new way of connecting to our family history,” MyHeritage founder Gilad Japhet said in a statement.
Developed with Israeli computer vision firm D-ID, Deep Nostalgia uses deep learning algorithms to animate images with facial expressions based on those of MyHeritage employees.
Some of the company’s users took to Twitter on Friday to share the animated images of their deceased relatives, as well as moving depictions of historical figures, including Albert Einstein and Ancient Egypt’s lost Queen Nefertiti.
“Takes my breath away. This is my grandfather who died when I was eight. @MyHeritage brought him back to life. Absolutely crazy,” wrote Twitter user Jenny Hawran.
While most expressed amazement, others described the feature as “spooky” and said it raised ethical questions. “The photos are enough. The dead have no say in this,” tweeted user Erica Cervini.
From chatbots to virtual reality, the tool is the latest innovation seeking to bring the dead to life through technology.
Last year U.S. rapper Kanye West famously gifted his wife Kim Kardashian a hologram of her late father congratulating her on her birthday and on marrying “the most, most, most, most, most genius man in the whole world”.
‘ANIMATING THE PAST’
The trend has opened up all sorts of ethical and legal questions, particularly around consent and the opportunity to blur reality by recreating a virtual doppelganger of the living.
Elaine Kasket, a psychology professor at the University of Wolverhampton in Britain who authored a book on the “digital afterlife”, said that while Deep Nostalgia was not necessarily “problematic”, it sat “at the top of a slippery slope”.
“When people start overwriting history or sort of animating the past … You wonder where that ends up,” she said.
MyHeritage acknowledges on its website that the technology can be “a bit uncanny” and its use “controversial”, but says steps have been taken to prevent abuse.
“The Deep Nostalgia feature includes hard-coded animations that are intentionally without any speech and therefore cannot be used to fake any content or deliver any message,” MyHeritage public relations director Rafi Mendelsohn said in a statement.
Yet, images alone can convey meaning, said Faheem Hussain, a clinical assistant professor at Arizona State University’s School for the Future of Innovation in Society.
“Imagine somebody took a picture of the Last Supper and Judas is now winking at Mary Magdalene – what kind of implications that can have,” Hussain told the Thomson Reuters Foundation by phone.
Similarly, Artificial Intelligence (AI) animations could be used to make someone appear to be doing things they might not be happy about, such as rolling their eyes or smiling at a funeral, he added.
Mendelsohn of MyHeritage said using photos of a living person without their consent was a breach of the company’s terms and conditions, adding that videos were clearly marked with AI symbols to differentiate them from authentic recordings.
“It is our ethical responsibility to mark such synthetic videos clearly and differentiate them from real videos,” he said.
(Reporting by Umberto Bacchi @UmbertoBacchi in Milan; Editing by Helen Popper. Please credit the Thomson Reuters Foundation, the charitable arm of Thomson Reuters, that covers the lives of people around the world who struggle to live freely or fairly. Visit http://news.trust.org)
Does your institution have operational resilience? Testing cyber resilience may be a good way to find out
By Callum Roxan, Head of Threat Intelligence, F-Secure
If ever 2020 had a lesson, it was that no organization can possibly prepare for every conceivable outcome. Yet building one particular skill will make any crisis easier to handle: operational resilience.
Many financial institutions have already devoted resources to building operational resilience. Unfortunately, it often takes what Miles Celic, Chief Executive Officer of TheCityUK, calls a “near death” experience for this conversion to occur. “Recent years have seen a number of cases of loss of reputation, reduced enterprise value and senior executive casualties from operational incidents that have been badly handled,” he wrote.
But it need not take a disaster to learn this vital lesson.
“Operational resilience means not only planning around specific, identified risks,” Charlotte Gerken, the executive director of the Bank of England, said in a 2017 speech on operational resilience. “We want firms to plan on the assumption that any part of their infrastructure could be impacted, whatever the reason.” Gerken noted that firms that had successfully achieved a level of resilience that survives a crisis had established the necessary mechanisms to bring the business together to respond where and when risks materialised, no matter why or how.
We’ll talk about the bit we know best here: by testing for cyber resilience, a company can do more than prepare for the worst attacks it may face. The process can also give any business a clearer view of how it operates, and how well it is prepared for all kinds of surprises.
Assumptions and the mechanisms they should produce are the best way to prepare for the unknown. But, as the boxer Mike Tyson once said, “Everyone has a plan until they get punched in the mouth.” The aim of cyber resilience is to build an effective security posture that survives that first punch, and the several that are likely to follow. So how can an institution be confident that it has achieved genuine operational resilience?
This requires an organization to assess itself honestly, in the spirit of the motto inscribed at the Temple of Apollo at Delphi: “Know thyself.” And when it comes to cyber security, there is a way for an organization to test just how thoroughly it comprehends its own strengths and weaknesses.
The Bank of England was the first central bank to help develop a framework for institutions to test the integrity of their systems. That framework, CBEST, is made up of controlled, bespoke, intelligence-led cyber security tests that replicate the behaviours of real threat actors, and the testing often has unforeseen or secondary benefits. Gerken notes that the “firms that did best in the testing tended to be those that really understood their organisations. They understood their own needs, strengths and weaknesses, and reflected this in the way they built resilience.”
In short, testing cyber resilience can provide clear insight into an institution’s operational resilience in general.
Gaining that knowledge without a “near-death” experience is obviously a significant win for any establishment. And testing for operational resilience throughout the industry can remind every organization of the steps it should take so that testing provides unique insights about the institution, not just a checklist of cyber defence basics.
The IIF/McKinsey Cyber Resilience Survey of the financial services industry, released in March last year, provided six sets of immediate actions that institutions could take to improve their cyber security posture. The toplines of these recommendations were:
- Do the basics: patch your vulnerabilities.
- Review your cloud architecture and security capabilities.
- Reduce your supply chain risk.
- Practice your incident response and recovery capabilities.
- Set aside a specific cyber security budget and prioritise it.
- Build a skilled talent pool and optimize resources through automation.
But let’s be honest: if simply reading a solid list of recommendations created cyber resilience, cyber criminals would be out of business. Unfortunately, cyber crime as a business is booming, and threat actors targeting essential financial institutions through cyber attacks are likely earning billions in the trillion-dollar industry of financial crime.

A list can’t reveal an institution’s unique weaknesses – the security failings and chokepoints that could shutter operations, not just during a successful cyber attack but during various other crises that challenge their operations. And the failings that lead to flaws in an institution’s cyber defence likely reverberate throughout the organization as liabilities that other crises would expose.
The best way to get a sense of operational resilience will always be to simulate the worst that attackers can summon. That’s why the time to test yourself is now, before someone else does.
Thomson Reuters to stress AI, machine learning in a post-pandemic world
By Kenneth Li and Nick Zieminski
NEW YORK (Reuters) – Thomson Reuters Corp will streamline technology, close offices and rely more on machines to prepare for a post-pandemic world, the news and information group said on Tuesday, as it reported higher sales and operating profit.
The Toronto-headquartered company will spend $500 million to $600 million over two years to burnish its technology credentials, investing in AI and machine learning to get data faster to professional customers increasingly working from home during the coronavirus crisis.
It will transition from a content provider to a content-driven technology company, and from a holding company to an operational structure.
Thomson Reuters’ New York- and Toronto-listed shares each gained more than 8%.
It aims to cut annual operating expenses by $600 million through eliminating duplicate functions, modernizing and consolidating technology, as well as through attrition and shrinking its real estate footprint. Layoffs are not a focus of the cost cuts and there are no current plans to divest assets as part of this plan, the company said.
“We look at the changing behaviors as a result of COVID … on professionals working from home working remotely being much more reliant on 24-7, digital always-on, sort of real-time always available information, served through software and powered by AI and ML (machine learning),” Chief Executive Steve Hasker said in an interview.
Sales growth is forecast to accelerate in each of the next three years compared with 1.3% reported sales growth for 2020, the company said in its earnings release.
Thomson Reuters, which owns Reuters News, said revenues rose 2% to $1.62 billion, while its operating profit jumped more than 300% to $956 million, reflecting the sale of an investment and other items.
Its three main divisions, Legal Professionals, Tax & Accounting Professionals, and Corporates, all showed higher organic quarterly sales and adjusted profit. As part of the two-year change program, the corporate, legal and tax side will operate more as one customer-facing entity.
Adjusted earnings per share of 54 cents were ahead of the 46 cents expected, based on data from Refinitiv.
The company raised its annual dividend by 10 cents to $1.62 per share.
The Reuters News business showed lower revenue in the fourth quarter. In January, Stephen J. Adler, Reuters’ editor-in-chief for the past decade, said he would retire in April from the world’s largest international news provider.
Thomson Reuters also said its stake in The London Stock Exchange is now worth about $11.2 billion.
The LSE last month completed its $27-billion takeover of data and analytics business Refinitiv, 45%-owned by Thomson Reuters.
(Reporting by Ken Li, writing by Nick Zieminski in New York, editing by Louise Heavens and Jane Merriman)