Good afternoon!
And thank you, Iosif and to the team of Exactpro, for inviting us to make this presentation.
I'm Justyn Trenner. I'm
responsible for the research group that we have at QA Media.
And
Just to give you a very brief background on who we are...
We undertake specialist events and research in the financial quality assurance space.
I have a team of researchers who
catalog and gather a great deal of background
on all of the specialist vendors and consultants, but who also undertake specific programs of research
to discover attitudes, approaches and opinions, and that's what I'm going to be calling on
today. The rest of the information you can see on the screen, about myself and my partner, Matthew Crabbe, who founded the business.
What I'm going to present to you today is based on interviews with about a hundred
Quality Assurance professionals, like many of you in the room. In fact, some of you in the room may have contributed.
The vast majority of them are deciders on, and choosers of, specific vendors, and deciders on the process of Quality Assurance.
My own background includes building
quantitative trading tools
and being part of a management team that prioritized
technology projects and
gave far too little attention to the whole QA process and
gave far too little budget and time and opportunity to get the QA process done - a scenario
with which I'm sure everybody in the room is familiar.
So what's demanded of the QA teams at financial institutions?
Well, put simply: more for less. Mostly, if you're on the buy side, on the client side of this equation,
you're experiencing pressure on budgets, and that pressure is expected to be realized through
improved use of automation, by the magic words "agile" and "DevOps", of which more later, but
very undefined. At one of our events, a speaker very nicely summarized DevOps as being rather like teenage sex:
Everybody claims to know what it is,
very few people actually do know what it is,
and the number who have actually done it is even smaller, so it seems rather apposite.
And automation is a little bit different. Most people do know what that is,
but doing it with intelligence is perhaps the next challenge, and just
to alarm any vendors here, including our hosts: vendor renegotiation is part of where some of those savings are coming from.
But hold fire, because I'm going to give you slightly better news in a moment.
Because the other pressure that we, as a community, are under...
The pressure to deliver is of course 'Better software sooner' - the title of this
presentation.
The ultimate requirement from QA is we don't want buggy stuff out there.
But by the way, we want it yesterday and we want better customer experience. So, how do we square that circle?
Well, the great news is we start by squaring the circle by throwing a bit more money at the problem.
And that money is coming because of competitive pressures,
we have to produce software that competes with our competition. There is regulatory pressure, we've heard a lot about that this morning and
again, those funny words that I mentioned a few moments ago.
What's stopping us getting there?
Well, from talking to you and to the client-side community, we split the issues into two.
I'm going to briefly talk on this slide about the first two and then in the remainder of the presentation
I'm going to try and get into some examples of the other issues. So, the first two issues are enterprise level or strategic.
What do we mean by the pressure to innovate? On the one hand there's the pressure to do, perhaps, what
the ASX has done, which is to look at alternative tools, and to be seen to be looking at alternative tools.
On the other hand, there's the pressure for each individual department to come up with innovation. And
that is a
problem that then contributes to the second issue, which is business prioritization...
Every business is in some level of process transformation at the moment, or transformation process. And there is a constant pressure
to do my stuff first, before your stuff, and I'm going to come back to that issue. But that
competing business pressure to get
prioritization is key at a strategic level. If we break that down to a tactical level, then
what are the issues that we start to see? Well, we see the detail around innovation and business demands,
we also see the detail around regulatory
pressures, and Yvette talked us through that a little bit.
I'm going to say a bit more about that. And we see the ongoing pressure to deal with multiple legacy
systems and legacy fiefdoms
and
I will come into this in a bit more detail.
What does this mean specifically in the world that we're focusing on here today, which is the trading world?
Well, quite often, prioritization isn't done against a clear business vision.
There is no clear, well-articulated direction of travel for the business as a whole.
There's no let-up in regulatory pressures, data is all over the place and data is not consistent.
I'm sure that just about everybody in this room has had experience of
trying to normalize four-decimal-place data, or four-significant-figure data, with another system that either allows unlimited precision or truncates to two decimal places.
But nobody saw that one coming until the two systems were put together.
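That kind of mismatch is easy to reproduce. As a purely hypothetical sketch (the prices and rounding policies below are invented, not taken from any system mentioned in the talk), Python's decimal module shows how two feeds that start from the same trade price disagree once one of them truncates:

```python
from decimal import Decimal, ROUND_DOWN

def truncate(price: str, places: int) -> Decimal:
    """Truncate a price string to a fixed number of decimal places."""
    return Decimal(price).quantize(Decimal(10) ** -places, rounding=ROUND_DOWN)

# System A keeps four decimal places; System B truncates to two.
price_a = truncate("101.2357", 4)  # Decimal('101.2357')
price_b = truncate("101.2357", 2)  # Decimal('101.23')

# A naive reconciliation between the two systems now reports a break,
# even though both started from the same trade price.
diff = price_a - price_b
print(diff)  # 0.0057
```

Until the two feeds are joined in one test, each system's own tests pass, which is exactly why the problem surfaces so late.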
And, of course, the spaghetti of systems and
the ownership of the costs to solve these problems, the ownership of the budget being in different fiefdoms.
So, what do we need to do around clarity of vision and prioritization?
Well, the bad news is, of course, and I know I'm teaching you to suck eggs here, but
because every unit wants their changes implemented, you need to have a bigger, clearer strategic vision,
which very often those of us in QA aren't quite at the table to affect.
But I've certainly been in organizations, and I'm sure many of you have, where the person who has the CEO's ear
gets their change done first, or the customer who went out
on the golf course with the CEO or with the senior manager last weekend gets their change made first,
rather than a clear alignment of
requirement for change to the business priority in the business strategy.
What also then happens is that the resources associated with testing
actually get sucked up into development, get transferred, and you're left with inadequate resources to test. Testing is done poorly, and
the activity overload that then hits testing means that you're wasting time doing nice-to-haves
("it would be nicer if this button were green and not blue") rather than need-to-haves ("if we don't make sure that this process is
delivered, we're going to get some bloody great fine for
embargo filtering not having actually taken place in this payment system").
And that's a real example from at least three banks that I can think of as I'm standing here.
So resource is going to the wrong place.
So the solution is not necessarily within the control of this room,
but the solution is clearly around, number one, clear business prioritization.
But number two - having clear resource allocation.
And we see, time and again, that a lack of centralized resource over key parts of testing
gets in the way of effective testing.
Regulation jumping up, apparently at the last minute: MiFID
compliance and
the Dodd-Frank implementation are two
perfect examples of this, where the regulations were being negotiated right up to within a few weeks of coming into force.
Which was a great excuse for the businesses to say, we'll sort that out when we actually know what we're meant to be doing.
Which then meant that getting the systems built in time was enough of a challenge, let alone
leaving any of you time to test anything. So,
the need to do scenario-based planning...
We've heard a few examples, and in the trading world especially,
things like algo testing being put under pressure by the regulators before the algos are put into
production use; that's something that we already see.
From the conversations I'm having, I'm seeing the regulators getting more and more interested in the minutiae of testing. And
in my experience over the last 20 years, it's interesting how frequently
BaFin, and
before them other German regulators, have in many ways led the way: they raised issues in regulation that then became
industry-wide issues of regulation, algos being a great example. Well, one of the things that I'm hearing,
and we've done some recent research on this, is that many
financial institutions
believe that their contracts cover them to use production data in test systems.
It's not at all clear to me that regulators would find that answer acceptable. Maybe their legal contracts cover them, but the
regulation around
GDPR, and other similar sets of regulation, won't cover them.
So not having your test environment set up in a way that can flexibly adapt to
what could be quite last-minute changes in the interpretation of regulation is a challenge.
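One common way to address that concern, offered here as an assumption about current practice rather than anything the speaker prescribes, is to pseudonymize production records before they reach a test environment, so the data keeps its shape and joinability without exposing real customer details. The field names and salt below are hypothetical:

```python
import hashlib

def pseudonymize(record: dict, pii_fields: list, salt: str = "per-env-secret") -> dict:
    """Replace PII fields with a salted, truncated hash. The same input
    always maps to the same token, so joins across records still work."""
    out = dict(record)
    for field in pii_fields:
        value = str(out.get(field, ""))
        out[field] = hashlib.sha256((salt + value).encode()).hexdigest()[:16]
    return out

# Hypothetical trade record: identity fields are masked, economics are kept.
trade = {"client_name": "ACME Corp", "account": "DE00123456", "qty": 500}
masked = pseudonymize(trade, ["client_name", "account"])
```

Deterministic masking is the key design choice here: the test environment can still reconcile records that refer to the same client, without ever holding the real name or account number.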
And again, I'm going to say:
allocate more resources, allocate more resources centrally, and, of course, make smart use of
automation. All are parts of the solution.
So, coming more specifically onto data. Well, data is
often way behind in
priorities on the client side.
So,
although there may be central resources for certain functions, in the vast majority of organizations that we've spoken to,
test data is the
province, and the challenge, of the specific application manager or application test manager.
And that's a huge problem because the test data that
that application test manager needs to use is, of course, going to come from multiple different environments and be in multiple different formats and
also require multiple different levels of expertise.
To run a test without appropriately dirty data is
an inadequate test, as you're all aware. But what is appropriate dirtiness?
Of course, appropriate dirtiness in market data is different from appropriate dirtiness in
payments data,
etc., etc. So you need the ability to pull together an
appropriate frame of test data. Otherwise...
We're aware that something like 74 percent of the funds that we speak to claim that they now work on an agile basis.
But their data for those tests is not agile,
except in about 10 percent of cases.
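The idea of domain-specific "appropriate dirtiness" can be sketched as deliberately injecting a controlled fraction of known defect types into otherwise clean test data. This is only an illustration; the defect profiles and field names are invented for the sketch:

```python
import random

# Hypothetical domain-specific "dirtiness" profiles: the defects that
# matter for market data differ from those that matter for payments.
DIRT_PROFILES = {
    "market_data": [
        lambda r: {**r, "price": -abs(r["price"])},        # negative price
        lambda r: {**r, "timestamp": None},                # missing timestamp
    ],
    "payments": [
        lambda r: {**r, "beneficiary": ""},                # blank beneficiary
        lambda r: {**r, "amount": round(r["amount"], 6)},  # precision overflow
    ],
}

def dirty(records, domain, rate=0.1, seed=42):
    """Return a copy of records with a controlled fraction made 'dirty'
    using the defect profile for the given domain."""
    rng = random.Random(seed)
    out = []
    for rec in records:
        if rng.random() < rate:
            rec = rng.choice(DIRT_PROFILES[domain])(rec)
        out.append(rec)
    return out
```

Seeding the generator keeps the "dirt" reproducible run to run, which matters if the same dirty dataset has to be replayed across environments.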
Moving to DevOps: if your data isn't framed and ready, it's just not going to happen. So,
data needs to be seen not just as a defensive compliance requirement, but also as an opportunity.
Working across systems - again, an issue that everybody in the room will have had experience of.
A particular challenge is that segments of a workflow
are available for testing, but a complete, integrated
workflow, with all the latest versions, is often not available in a
development or test environment. And I've seen this many, many times,
where the tests that were carried out in
the segment that the application manager was responsible for worked just fine,
but when it was an end-to-end test or
when the system was put into production end-to-end, it didn't work. And so often,
so very often, it comes down to firewall issues and
interconnections that just haven't been adequately foreseen and tested.
And then this is the slide of shameless self-promotion.
What are we wanting? What are we looking to contribute to the debate?
Well, one of the things that we want to contribute is a look at how effectively
vendors and
consultants are contributing to that core goal of getting better software to market sooner.
So we have taken
fairly standard approaches,
but, we believe, combined them in an innovative way:
first to score the impact that
the vendors have, and then to look at the satisfaction they generate from their clients.
By way of an indication, of course, we can break this down by individual
system integrators and individual specialist vendors, but by way of an interesting overall
overview of the data that we have, we looked at the combination of all
consultants and SIs as one group and the combination of all specialist vendors as another. So,
Capgemini or Accenture might be
represented in the system integrators, and you can probably think of many others;
our friends at Exactpro, and others, would of course be
represented in the other group.
The danger of what I'm presenting is that I'm giving you an average of averages;
I imagine everybody knows the challenge of that.
However, that said, there are a couple of interesting
points. In general, system integrators and
consultants do work across the whole of the system, the whole of the architecture, and therefore they generate a certain
loyalty. But I also believe that's because they have more control of the entire process, and more influence over the process and the thinking around
the process.
Specialist vendors, you can see from our chart,
without any great need to go into detail,
seem to have less impact.
Of course, there are some who have great impact and some who have less. However,
they are prey to one very specific problem, and one anecdote will illustrate it.
There's a vendor that I imagine most of you in this room would know - it's not Exactpro -
who was retained about three years ago by a very familiar name,
a very familiar global bank, and about a year or two later they were let go.
They still have a great dialogue, and the specific reason was that the bank
concerned never got their proverbial together to be able to exploit and benefit from
the vendor.
The issues that were blocking getting good value out of the vendor were
endogenous
to the way that that bank worked. Whereas, where you are a
broader-based
consultant, of course, you may have more influence over that
overall process, and more places within the process where you can score points and be successful.
And in fact, if we look at the middle of the satisfaction ratings, we find, unsurprisingly, that the specialists are just as expert, and
one might argue in their very specific domains often more expert than the consultants.
This is a
2018 set of releases that we'll be doing on
the impact of the vendor community on the QA processes of the
client side.
A 2019 program, which I hope we'll get to work on with some of you, is to actually come up with objective benchmarks
for the
client side as well: to benchmark the overall process and help identify,
from an overview perspective, the strengths and weaknesses of the way that QA is being implemented across the
organization, and then implemented as a discipline in itself.
So pulling that together:
The first conclusion is really business 101: you need a business strategy;
you can't have a technology strategy until you have a business strategy;
and you can't have a viable testing strategy and testing priorities
until you have the technology priorities.
You need an enterprise-wide view, not purely a local view of so much that feeds into testing.
You need not just automation
but you need smart automation, which of course means AI
and adaptive learning. And,
again, singing from my own song sheet: you need more complete
benchmarks to measure whether the vendors you're using
and the processes you're using are really making the difference they're intended to make.
Thank you. Any questions, any comments?
Thank you, Justyn!
On your chart of
system integrators against vendors: there was no cost involved in that. I would suggest that
the cost of system integrators, against vendors, would be
a multiple of two or three times.
I think it's deceptive to put that out without any financial impact on it.
I think that's fair, actually. It definitely wasn't apparent in the way I presented, but there is a value-for-money component
embedded within that, and you're absolutely right. The devil is in the detail.
Thank you!
So, one question from me: why is your project named The QA Vector 500?
Why not 1,000, or 250, or...?
Well, to reveal one or two trade secrets: when we started the project,
just about a year ago now actually,
we had in our database about
220 specialist vendors in the space.
By
the end of last year we had about 380.
By the time we decided how to name it, we had about 400, and we thought five hundred gave us latitude.
Besides, five hundred is a number that we can get our heads around.
Right, thank you!
Any more questions to Justyn?
No? Okay. Thank you, Justyn.
Thank you. Thank you very much!