-------------------------------------------
Facial recognition software among FBISD potential security upgrades - Duration: 2:42.
-------------------------------------------
ATM MACHINE SWIPING SOFTWARE HACKED - Duration: 0:53.
-------------------------------------------
Cake Projection Mapping Tutorial (Using Free Software) - Duration: 18:06.
Welcome to this tutorial on how to projection map a cake. If you've projection
mapped before then some of this might be a little tedious for you because I
really will be starting with the basics and giving a lot of detail about all
aspects of this process. You may have seen Luma Bakery's videos on this
channel; if you haven't, check them out. I've offered this service professionally for over two years, but I'm now moving away from it and focusing more on teaching others how to achieve this technique, building a cake mapping community, and creating different, more creative cake mapping projects for others to try themselves. This tutorial will use free software
called MapMap. In the past, when I've cake mapped on commercial projects, I've used QLab and MadMapper, but these are paid-for pieces of software, so if you're trying this for the first time, you're playing around, or you're doing a one-off project (maybe you've borrowed a projector), you might not want to pay for software. If you'd be interested in learning about my QLab/MadMapper workflow, and also using multiple projectors, let me know in the comment section and I can put together a tutorial. In the description I've
included a link to some resources you can download for free which includes a
MapMap scene that's already been set up, you just need to map it to your specific
cake - that's if you don't want to go through these steps yourself, although
I'd recommend that you do give it a try. I've also included the video content
which you are free to use for your personal projects. I've also listed all
the hardware and equipment I'm using, including the make and model of the
projector down below. In order to successfully cake map you're obviously
going to need a cake. Mine was made for me by a professional baker, and you might want to get a professional to make yours. If you wanted to create a full dummy version yourself, you're going to need styrofoam dummies covered in white fondant icing; links to these materials are below. The cake I've always worked with has five tiers, all five inches high, starting with an 8-inch square tier at the top and getting larger in two-inch increments as you go down: 10 inches, 12 inches, 14 inches, with 16 inches at the bottom. These are the dimensions that are going to work with the content provided here, or a version that you scale proportionally with these ratios, as in the sketch below. You might also just want to project onto three or four tiers; that would work too.
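If you're scaling the cake up or down, a quick script saves some arithmetic. Here's a minimal Python sketch (the names and the helper are mine, purely illustrative, and not part of the tutorial's downloadable resources) that prints each tier's footprint at a chosen scale:

    # Tutorial dimensions: five square tiers, each 5 inches high,
    # 8 inches wide at the top, growing by 2 inches per tier.
    BASE_TIER_WIDTHS_IN = [8, 10, 12, 14, 16]  # top to bottom
    TIER_HEIGHT_IN = 5

    def scaled_tiers(scale=1.0):
        """Return (width, height) per tier, preserving the tutorial's ratios."""
        return [(w * scale, TIER_HEIGHT_IN * scale) for w in BASE_TIER_WIDTHS_IN]

    for width, height in scaled_tiers(0.75):  # e.g. a three-quarter-size build
        print(f"{width:.1f} in wide x {height:.2f} in tall")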
A few things to note: even though they are polystyrene inside, they are very heavy, especially the largest one, so transporting them and moving them around can be tricky. They do keep, though. I've had these for two and a half years now, but they are really showing their age, with some marks and discoloration, and I'd consider them well past being acceptable for a paying job. One thing I
found is that the icing sweats over time. Another baker has recommended a type of
fondant icing that is designed for tropical temperatures and would be more
resistant to this problem in the long run. I've put the link to that icing down
below. For commercial jobs - weddings, for example - where the couple wants to cut the cake, at least one tier has to be real. To keep things simple for me and most cost-effective for the client, I would tend to offer the top 8-inch tier as real cake and the remaining four tiers as dummies, and then provide however many slices of cut cake were needed to make up the numbers beyond what you get from the top tier, which is about 35 to 40 wedding portions or 16 dessert portions. You'll find that the edges and corners of real cake are softer, and you won't have as crisp lines to work with as you do on the dummies, which is another reason to use fake cake where possible. The projector I'm using
is a short-throw, 2200-lumen projector by BenQ. It's been discontinued now and superseded by another model, which I've linked to below. I really like this projector and it's served me really well. One thing I really like is that it has a six-speed colour wheel, which reduces the rainbow effect you get with some projectors and produces really rich, bright colours. Like I said, it has a short
throw lens which is what I wanted for the setup I was going for. That means it
can be much closer to the projected surface than a typical projector you
might use for a home cinema. You maintain a lot of brightness by projecting from
close range and it's harder for people to get in the way of the projections if
you're setting the projector up on a tabletop, not rigging it overhead on a
truss, for example. I personally wasn't interested in using trusses because
they're more of a challenge logistically and I think they can sometimes be
unsightly and imposing on a space. There are some disadvantages, though, to projecting from close quarters rather than from overhead, such as the risk of the audience knocking the projectors, so you'll have to decide what works best for you. I want to talk briefly about the image asset I've provided called "numbered UV". This is something that's going to help us during the mapping phase. It might look confusing at first, and it's a more sophisticated approach than is strictly needed, but I believe it's good practice to get the most out of your projector and make your projections look the best they can. The reason it's set up like this is to maximize the number of pixels we're utilizing within the projector's resolution. My projector is full HD, also known as 1080p, which means it has a resolution of 1920 by 1080 pixels. I want to put as many of those pixels to work as possible by splitting apart the tiers and packing them as efficiently as I can within the 1920 by 1080 space. If you're confused, it will become clear in a moment, and the sketch below gives a rough sense of the gain. The tiers are numbered 1 to 5, with 1 at the top and 5 at the bottom, to help you know which section of content is going on which surface of the cake.
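To give a rough sense of why the packing matters, here is a back-of-the-envelope Python sketch. The numbers and names are mine and purely illustrative: it compares the pixels that land on the five front faces when the whole 25-inch-tall cake is framed upright in one 1080p image versus an idealized layout where the packed regions fill the frame.

    # Illustrative pixel-utilisation estimate for a 1080p projector.
    FRAME_W, FRAME_H = 1920, 1080
    TIER_WIDTHS_IN = [8, 10, 12, 14, 16]  # front-face widths, top to bottom
    TIER_HEIGHT_IN = 5

    # Naive: fit the whole 25-inch-tall cake into the frame's height.
    px_per_inch = FRAME_H / (len(TIER_WIDTHS_IN) * TIER_HEIGHT_IN)  # 1080 / 25
    naive_pixels = sum(w * TIER_HEIGHT_IN * px_per_inch ** 2 for w in TIER_WIDTHS_IN)

    # Packed: each face gets its own region of the numbered-UV layout, so in
    # the ideal case nearly the whole 1920x1080 frame carries cake content.
    packed_pixels = FRAME_W * FRAME_H

    print(f"naive: {naive_pixels:,.0f} px ({naive_pixels / packed_pixels:.0%} of the frame)")

On these assumptions the naive framing puts only about a quarter of the frame's pixels onto the cake faces, which is the waste the numbered UV layout is designed to claw back.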
Time to set up the projector. You can mount the projector on a tripod or just sit it on a table. I use a free-standing plinth I built myself for this purpose. You want to set up the projector far enough away that the projected light covers the cake from top to bottom, but near enough, again, to take advantage of all the pixels the projector has to offer. As for how far away to set up the projector from the cake, you can work this out by trial and error by moving the projector nearer and further away, or, if you're not afraid of a bit of maths, look at your projector's specification and find its throw ratio. This number represents the distance of the projector from the surface divided by the width of the projected image, measured in the same units.
The smaller the throw ratio number, the closer your projector can be to the cake.
For example, as I said my projector has a short throw lens and its throw ratio is
0.69 to 0.83. It has a range because its lens has an optical zoom; many lenses are fixed, though, so you'll only have one number to work with. To work this through with my own projector as an example, I'll take
the smaller number and I know that that equals the minimum distance the
projector can be from the cake, divided by the width of the projected image. Now
I have five tiers, each five inches high, so the cake has a
total height of 25 inches or 0.635 meters. I also know that the aspect ratio
of the projector is 16:9, that is, the ratio of the output's width to its height: for every 16 units of width there are 9 units of height. If you're still with me, a piece of video content that needs to be 0.635 meters high, like ours, will be 1.13 meters wide. To get the distance between the cake and my projector, I plug this into the equation and discover that I can project from 0.78 meters from the cake (there's a short script below that does this arithmetic). Trial and error is obviously less complicated and fine for most people, but sometimes you can't scout out an event location beforehand, or you don't have lots of set-up time on the day, and it's good to plan ahead and know how long your cables need to be, for example. Some projectors can be turned on their side and project in portrait mode. This would actually be better for us because our cake is taller than it is wide. However, my projector isn't designed to do this, so I've kept it landscape.
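If you'd rather let a script do the distance arithmetic from a moment ago, here's a minimal Python sketch of the same calculation; the function and its defaults are my own illustration, not something from the tutorial's resources:

    # Estimate projector distance from throw ratio, image height and aspect ratio.
    def projection_distance(throw_ratio, image_height_m, aspect_w=16, aspect_h=9):
        """Distance (meters) at which the projected image reaches the given height."""
        image_width_m = image_height_m * aspect_w / aspect_h  # width of a 16:9 frame
        return throw_ratio * image_width_m  # throw ratio = distance / image width

    cake_height_m = 25 * 0.0254  # five 5-inch tiers = 0.635 m
    print(f"{projection_distance(0.69, cake_height_m):.2f} m")  # ~0.78 m at widest zoom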
Plug your projector into a power source and turn it on. Connect it to your laptop or
computer. I'm using a MacBook Pro (specifications below), which is powerful enough even for heavy content and outputting to multiple projectors. I'm using a mini DisplayPort to HDMI cable, which goes into the back of the projector in HDMI slot one. It's useful to turn off any lights and reduce ambient light at this point so you can see the projection better. Now it's time to start mapping.
The first thing to mention is that I will be working on a Mac so some of these steps,
especially when setting up the projector and your display, might be a little bit
different for you if you're working on a PC, for example. But if you are on a Mac, you want to come down here into System Preferences and go to Displays. I can already see my projector here, it's the BenQ, and I want to make sure the output is optimized for it and not for my laptop's display. In Arrangement, I want Mirror Displays turned on. In the documentation for MapMap it actually says that it doesn't want Mirrored Displays on, but I found that this is actually the way to make it work. So, opening up MapMap, this is what we see: not very much. And the first
thing that I want you to do is to come up here into the top left and click on this little piece of film, "import a video or image file", and bring two things in. The first thing is this "numbered UV" image that we've talked about before; here it is, now sitting in your library. The second thing is the video we actually want to put onto the cake, "cake mapping projected content". There they both are in your library, but you don't see anything in either of your editors yet until you press Add Mesh Layer, this plus on top of a rectangle. So here's the particle cascade that we're going to be putting on a cake; you can pause and play it over here on the top right, and here in the inspector we see its source is the "projected content". If we change that to "numbered UV", then we see this instead. If you haven't done any mapping at all before,
then the basic idea, really, is to use this bounding box to isolate sections of the video, so that we can manipulate just that section onto an individual surface, separate from the other portions of the video. Just by moving these corner handles around, you can see we're manipulating the output so that we can stick it to the corners of the surface we've chosen; in this case it's the front face of the bottom tier, on the left.
Now, just a couple of things about working inside MapMap. If you can't see these handles, or if you can't see the crosshairs, then you want to come up here to View and toggle on Display Controls, and then you will be able to see them. Another thing: working inside the input editor, or inside both editors, if you want to zoom in and out, you can do so using the scroll wheel of your mouse, if you have one with a scroll wheel; alternatively, you can use the plus and minus magnifying glasses on the toolbar of both editors. Also, MapMap has various modes that you can use
to manipulate these handles and the bounding box around the video. We're currently in Move, accessed through the hotkey M, so we're moving these handles. If you want to scale the bounding box, that's the hotkey S, and R is rotate. Alternatively, if you click inside the bounding box you cycle through the different modes, so you would just click repeatedly until you're on the mode that you want. Let's start as we
mean to go on by renaming our first mesh "left five", just so that we can work cleanly and we know what we're doing. Now I want to zoom right in, and I just want to make these corners as neat as they possibly can be, so we're not leaving any of the content overhanging or outside our box, but we're also not including any black borders or slivers of the black content that we don't want. Actually, something I should have said before: if the software hasn't automatically detected your projector when you toggle full screen and output your content, you want to come up here into View, then Output Screen, and make sure that your projector is set as the primary output, not your laptop or computer's display. Okay, so now in the output editor I'm going to use these handles to move these around and make sure they're really precisely positioned on the cake. I don't want it spilling onto any of the other surfaces, I don't want it going over the lip of the cake, and I also don't want any light on the back wall, because that will be distracting and not what we want. Okay, so I'm happy with five and I'll move on to four. We can
duplicate "left five" because it's much the same just a rectangle box and rename
it "left four" and just zoom out so that we know what we're looking at a little
bit better, so four is over here and it's rotated 90 degrees clockwise so we want
to rotate five ninety degrees clockwise and then we'll just move it over on top
of four and simply go through the same process again of just moving these
handles so that they snugly fit around the content that we want to put onto four
Okay so when we're happy with four we'll do exactly the same thing: duplicate left
four, rename it "left three", simply move it across, tweak the handles so that they
fit around tier three and just move those handles ever so slightly just so
that they fit around content box three, output onto the cake and move those
handles until we're happy.
Now I think you get the idea
so I'll just fast forward a little bit whilst I do the two remaining tiers on the left side.
Okay, so I'm happy with all of that on the left; I'll just rename this "left one". The beauty of the content we're using, the particle cascade, is that all the content on the left can be mirrored over onto the right. So if we duplicate "left one" and name it "right one", all we need to do is take this box here and mirror it over onto the right. Now, I found the software to be a little bit funny about doing this, so the method that worked for me was taking these two left-hand handles and bringing them in close to the right-hand handles; I found that the remaining ones then sort of pop over to the other side, and suddenly the software's happy for you to do it. If you follow the method I just used, I don't think you should have a problem. So there we have it: it's now reflected onto the other side. It's a mirror image, so it's exactly the same process as before, just tweaking the handles, except on the right-hand side, and then we just move down the cake doing exactly the same as before. (The sketch below shows the geometry of that mirror, if it helps.)
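MapMap is driven entirely through its GUI, so the handle-dragging workaround above is what you'll actually do; conceptually, though, the operation is just reflecting each corner handle across a vertical axis. This Python sketch (purely illustrative, my own names and coordinates) shows the geometry:

    # Conceptual only: mirroring a quad's corner handles across a vertical axis.
    def mirror_quad(corners, axis_x):
        """Reflect (x, y) corner handles across the vertical line x = axis_x."""
        return [(2 * axis_x - x, y) for (x, y) in corners]

    left_face = [(100, 400), (300, 400), (300, 600), (100, 600)]
    print(mirror_quad(left_face, axis_x=960))  # same quad, flipped to the right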
So there we are: each front face of each tier has its own piece of content, exactly like we wanted. So now you probably want to go in here and finely tune your handles and your mapping, just so that you're happy with it and it's as precise as you want it to be. If I was doing this on a real job, I'd really take care at this stage. I want it to look good, and I don't want any stray bits of content getting onto the walls or onto any surfaces where they shouldn't be, so I'd really take care at this stage. I'll be fairly quick now for the purpose of this tutorial, but you really just want to go through each one and make sure things line up. Once that's done, you want to save your file, as we don't want to lose all of our good work. I'm happy to keep this name; I'll just save over one that I did earlier. And now all that's left to do, down here in our source, is to change it back from "numbered UV" to the particle cascade. We've done that for the "right five" mesh, and we will just go through and do that for all the others now.
Okay, that's done. So with any luck, if we play the footage and then output it to a projector - whoops, okay - we can see what we've forgotten to turn off, which is up here in View: toggle off those Display Controls. They were useful before, but now we don't want to see them anymore. Play our content, and there it is on our cake, looking beautiful. So congratulations, you've mapped a cake. This content is looped, so it will just go round and round until you want to turn it off. If you've enjoyed this, please subscribe and like this video.
Have fun and happy cake mapping.
-------------------------------------------
EXTENT-2018: The Route to Better Software, Sooner: A Perspective from QA Vector Research - Duration: 19:41.
Good afternoon!
And thank you, Iosif, and the team at Exactpro, for inviting us to make this presentation.
I'm Justyn Trenner. I'm responsible for the research group that we have at QA Media. And
Just to give you a very brief background on who we are...
We undertake specialist events and research in the financial quality assurance space.
I have a team of researchers who
catalog and get a great deal of background
on all of the specialist vendors and consultants, but also undertake specific programs of research
to discover attitudes and approaches and opinions, and that's what I'm going to be calling on
today. The rest of the information, about myself and my partner Matthew Crabbe, who founded the business, you can see on the screen.
What I'm going to present to you today is based on interviews with about a hundred
Quality Assurance professionals, like many of you in the room; in fact, some of you in the room may have contributed. The vast majority of them are deciders on, and choosers of, specific vendors, and deciders on the process of Quality Assurance.
My own background includes building
quantitative trading tools
and being part of a management team that prioritized
technology projects and
gave far too little attention to the whole QA process and
gave far too little budget and time and opportunity to get the QA process done - a scenario
with which I'm sure everybody in the room is familiar.
So what's demanded of the QA teams at financial institutions?
Well, put simply: more for less. If you're on the buy side, the client side of this equation, you're mostly experiencing pressure on budgets, and that pressure is expected to be realized through improved use of automation and by the magic words "agile" and "DevOps", of which more later, but very undefined. At one of our events, a speaker very nicely summarized DevOps as being rather like teenage sex: everybody claims to know what it is, very few people actually do know what it is, and the number who have actually done it is even smaller, so the analogy seems rather apt. And automation is a little bit different. Most people do know what that is, but doing it with intelligence is perhaps the next challenge. And, just to alarm any vendors here, including our hosts: vendor renegotiation is part of where some of those savings are coming from. But hold fire, because I'm going to give you slightly better news in a moment.
Because there's another pressure that we as a community are under... the pressure to deliver, which is of course 'Better software sooner', the title of this presentation. The ultimate requirement from QA is that we don't want buggy stuff out there. But, by the way, we want it yesterday and we want better customer experience. So, how do we square that circle?
Well, the great news is we start to square the circle by throwing a bit more money at the problem. And that money is coming because of competitive pressures: we have to produce software that competes with our competition. There is regulatory pressure, we've heard a lot about that this morning, and, again, those funny words that I mentioned a few moments ago.
What's stopping us getting there?
Well, from talking to you and to the client-side community, we split the issues into two.
I'm going to briefly talk on this slide about the first two and then in the remainder of the presentation
I'm going to try and get into some examples of the other issues. So, the first two issues are enterprise level or strategic.
What do we mean by the pressure to innovate? On the one hand, there's the pressure to do, perhaps, what the ASX has done, which is to look at alternative tools, and to be seen to be looking at alternative tools. On the other hand, there's the pressure for each individual department to come up with innovation. And that is a problem that then contributes to the second issue, which is business prioritization... Every business is in some level of process transformation at the moment. And there is a constant pressure to do my stuff first, before your stuff, and I'm going to come back to that issue; but that competing business pressure to get prioritization is key at a strategic level. If we break that down to a tactical level, then
what are the issues that we start to see? Well, we see the detail around innovation and business demands,
we also see the detail around regulatory
pressures, and Yvette talked us through that a little bit.
I'm going to say a bit more about that. And we see the ongoing pressure to deal with multiple legacy
systems and legacy fiefdoms, and I will come into this in a bit more detail.
What does this mean specifically in the world that we're focusing on here today, which is the trading world?
Well, quite often prioritization isn't done against a clear business vision. There is not a clear, well-articulated direction of travel for the business as a whole. There's no let-up in regulatory pressures; data is all over the place and data is not consistent.
I'm sure that just about everybody in this room has had the experience of trying to normalize four-decimal-place data, or four-significant-figure data, with another system that either is unlimited or truncates to two decimal places. But nobody saw that one coming until you put the two systems together.
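As a toy illustration of that normalization trap (my own example, not from the research), here's a short Python sketch showing a four-decimal-place price failing to survive a round trip through a system that only keeps two:

    from decimal import Decimal, ROUND_HALF_UP

    upstream = Decimal("101.2345")  # price held to four decimal places
    # The downstream system only stores two decimal places.
    downstream = upstream.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

    print(downstream)               # 101.23
    print(upstream == downstream)   # False: the two systems no longer reconcile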
And, of course, there's the spaghetti of systems, and the ownership of the costs to solve these problems, the ownership of the budget, sitting in different fiefdoms.
So, what do we need to do around clarity of vision and prioritization?
Well, the bad news is, of course (and I know, in a sense, I'm teaching you to suck eggs here), that because every unit wants their changes implemented, you need to have a bigger, clearer strategic vision, which very often those of us in QA aren't quite at the table to affect. But I've certainly been in organizations, and I'm sure many of you have, where the person who has the CEO's ear gets their change done first, or the customer who went out on the golf course with the CEO or with the senior manager last weekend gets their change made first, rather than there being a clear alignment of the requirement for change to the business priority and the business strategy.
What also then happens is that the resources associated with testing actually get sucked up into development, they get transferred, and you're left with inadequate resources to test; testing is done poorly, and the activity overload that then hits testing means that you're wasting time doing nice-to-haves ("it would be nicer if this button were green and not blue") rather than need-to-haves ("if we don't make sure that this process is delivered, we're going to get some bloody great fine for embargo filtering not actually having taken place in this payment system"). And that's a real example from at least three banks that I can think of as I'm standing here. So resource is going to the wrong place.
So the solution is not necessarily within the control of this room, but the solution is clearly around, number one, clear business prioritization; and number two, clear resource allocation. And we see time and again that the lack of centralized resource over key parts of testing gets in the way of effective testing.
Regulation jumping up apparently at the last minute: MiFID compliance and the Dodd-Frank implementation are two perfect examples of this, where the regulations were being negotiated right up to within a few weeks of them coming into force. That was a great excuse for the businesses to say "we'll sort that out when we actually know what we're meant to be doing", which then meant that getting the systems built in time was enough of a challenge, let alone leaving any of you time to test anything. So there's
the need to do scenario-based planning...
We've heard a few examples, and in the trading world especially, things like algo testing being put under pressure by the regulators before the algos are put into production use; that's something we already see. From the conversations I'm having, I'm seeing the regulators getting more and more interested in the minutiae of testing, and in my experience over the last 20 years, it's interesting how frequently BaFin, and before them other German regulators, in many ways led the way: they raised the issues in regulation that then became industry-wide issues of regulation, algos being a great example. Well, one of the things that I'm hearing, and we've done some recent research on this, is that many financial institutions believe that their contracts cover them to use production data in test systems. It's not at all clear to me that regulators would find that answer acceptable. Maybe their legal contracts cover them, but the regulation around GDPR and other similar sets of regulation won't cover them.
So, not having your test environment set up in a way that can flexibly adapt to what could be quite last-minute changes in the interpretation of regulation is a challenge.
And again, I'm going to say: allocate more resources, allocate more resources centrally, and of course make smart use of automation. All are parts of the solution.
So, coming more specifically onto data. Well, data is often way behind in priorities on the client side. So although there may be central resources for certain functions, in the vast majority of organizations that we've spoken to, test data is the province, and the challenge, of the specific application manager or application test manager. And that's a huge problem, because the test data that that application test manager needs to use is, of course, going to come from multiple different environments, be in multiple different formats, and also require multiple different levels of expertise.
To run a test without appropriately dirty data is an inadequate test, as you're all aware. But what is appropriate dirtiness? Appropriate dirtiness is, of course, different in market data than in payments data, etc., etc. So you need the ability to pull together an appropriate frame of test data. We're aware that something like 74 percent of funds that we speak to claim that they now work on an agile basis, but their data for those tests is not agile, except in about 10 percent of cases.
Moving to DevOps: if your data isn't framed and ready, it's just not going to happen. So, data needs to be seen not just as a defensive compliance requirement, but also as an opportunity.
Working across systems - again, an issue that everybody in the room will have had experience of.
A particular challenge is that segments of a workflow are available for testing, but a complete, integrated workflow with all the latest versions is often not available in a development or test environment. And I've seen this many, many times
where the tests that were carried out in
the segment that the application manager was responsible for worked just fine,
but when it was an end-to-end test or
when the system was put into production end-to-end, it didn't work. And so very often it comes down to firewall issues and interconnections that just haven't been adequately foreseen and tested.
And then this is the slide of shameless self-promotion.
What are we wanting? What are we looking to contribute to the debate? Well, one of the things that we want to contribute is looking at how effectively vendors and consultants are contributing to that core goal of getting better software to market sooner. So we have taken fairly standard approaches but, we believe, combined them in an innovative way: first to score the impact that the vendors have, and then to look at the satisfaction they generate from their clients.
By way of an indication, of course, we can break this down by individual system integrators and individual specialist vendors, but by way of an interesting overall overview of the data that we have, we looked at the combination of all consultants and SIs as one group and the combination of all specialist vendors as another. So, Capgemini or Accenture might be represented in the system integrators, and you can probably think of many others, and our friends at Exactpro and others would of course be represented in the other.
A danger of what I'm presenting is that I'm giving you an average of averages; I imagine everybody knows the challenge of that. However, that said, there are a couple of interesting points. In general, system integrators and consultants do work across the whole of the system, the whole of the architecture, and therefore they generate a certain loyalty; but also, I believe that's because they have more control of the entire process and more influence over the process and the thinking around the process. Specialist vendors, you can see from our chart, without any great need to go into detail, seem to have less impact. Of course, there are some who have great impact and some who have less. However,
they are prey to one very specific problem, and one anecdote will illustrate it. There's a vendor that I imagine most of you in this room would know (it's not Exactpro) who was retained about three years ago by a very familiar name, a very familiar global bank, and about a year or two later they were let go. They still have a great dialogue, and the specific reason was that the bank concerned never got their proverbial together to be able to exploit and benefit from the vendor. The issues that were blocking getting good value out of the vendor were endogenous to the way that that bank worked, whereas if you are a broader-based consultant, of course, you may have more influence over that overall process and have more places within the process where you can score points and be successful.
And in fact, if we look at the middle of the satisfaction ratings, we find, unsurprisingly, that the specialists are just as expert, and, one might argue, in their very specific domains often more expert than the consultants. This is a 2018 set of releases that we'll be doing on the impact of the vendor community on the QA processes of the client side. A 2019 program, which I hope we'll get to work with some of you on, is to actually come up with objective benchmarks for the client side as well: to benchmark the overall process and help identify, from an overview perspective, the strengths and weaknesses of the way that QA is being implemented across the organization and then implemented as a discipline in itself.
So, pulling that together: the first conclusion is really business 101. You need a business strategy; you can't have a technology strategy till you have a business strategy, and you can't have a viable testing strategy and testing priorities until you have the technology priorities. You need an enterprise-wide view, not purely a local view, of so much that feeds into testing. You need not just automation but smart automation, which of course means AI and adaptive learning. And, again, singing from my own song sheet: you need more complete benchmarks to measure whether the vendors you're using and the processes you're using are really making the difference they're intended to make. Thank you. Any questions? Any comments?
Thank you, Justyn! On your chart of system integrators against vendors: there was no cost involved in that. I would suggest that the cost of system integrators would be a multiple of two or three times that of the vendors. I think it's deceptive to put that out without any financial impact attached. I think that that's fair, actually. It definitely wasn't apparent in the way I presented it, but there is a value-for-money component embedded within that, and you're absolutely right: the devil is in the detail. Thank you!
So, one question from me: why is your project named The QA Vector 500? Why not 1,000, or 250, or...? Well, to reveal one or two trade secrets: when we started the project, just about a year ago now actually, we had about 220 specialist vendors in the space in our database. By the end of last year we had about 380. When we decided how to name it, we had about 400, and we thought five hundred gave us latitude. Besides, five hundred is a number that we can get our heads around. Right, thank you! Any more questions for Justyn? No? Okay. Thank you, Justyn. Thank you. Thank you very much!
-------------------------------------------
Aerospike NoSQL Database Software Benefits from Intel® Optane DC Persistent Memory | Intel Software - Duration: 1:40.
Aerospike is a database company.
We serve companies trying to put together front-edge systems
that really delight customers.
The ability to store data in a database is one thing.
But to be able to retrieve it rapidly is the other portion.
Our customers require a database that
is capable of terabytes of storage
and sub-millisecond database access.
With Intel Optane DC Persistent Memory,
we now have a technology that is capable of solving indexing
problems at a high level of density.
This allows the capability to build different indexing
strategies for databases.
This means that finding more specific data
is more efficient, thus allowing creation of finer-grained
and more powerful applications.
Our necessity at the front edge to provide
more personalized information to support machine learning
has really driven incredible increases in scale.
As big data volumes have gotten bigger,
our customers need terabytes of indexes.
Being able to provide a fast index
database with persistent memory allows big data technologies
and machine learning to become practical.
We've been working with Intel for nearly two years,
looking at the different optimizations.
With Intel Optane DC Persistent Memory,
Aerospike is able to provide great benefits to customers: index more data, and provide cheaper and more powerful database technology.
The capabilities of this technology
are practically endless.
[MUSIC PLAYING]
-------------------------------------------
Suite Archimede HR - Workforce management software for Industry 4.0 - Duration: 1:17.
-------------------------------------------
Why has Minster Grange been using Person Centred Software? - Duration: 2:12.
-------------------------------------------
Why Redis* Labs Embraces Intel® Optane™ DC Persistent Memory | Intel Software - Duration: 1:38.
Redis Labs is the company behind the popular open source
Redis, an in-memory NoSQL database downloaded
over a billion times.
We have over 8,000 enterprise customers.
In-memory databases tend to be expensive because RAM
is more expensive than disk.
The beauty of persistent memory is
that it allows you to extend use cases to things that, before, you couldn't have put in an in-memory database, because the price and the cost associated with managing this infrastructure was very, very high.
The fact that Redis can utilize this new technology and provide an in-memory experience at a fraction of the cost is the main benefit for our customers.
Everyone is looking for an instant response time.
Instant response time means less than 100 milliseconds from the time you click the button until you get a response from your application.
With Intel Optane DC persistent memory,
you can now practically put your entire data
on an in-memory database and enjoy
this sub-millisecond latency for your application performance.
Now we have a way to extend the RAM with something
that runs at approximately the same performance,
but is much cheaper to use.
This is why persistent memory's very important for us
and for our customers.
We believe that next-generation server architecture will all be built with persistent memory.
This is going to change the entire database market.
[MUSIC PLAYING]