Wednesday, May 30, 2018

Auto news on YouTube May 30 2018

This is AI:UX, a mini series focused on 10 guidelines that

were created for all those that are involved in the design

and development of AI based systems.

I am Daria Loi.

And today, I talk about guideline number seven.

Make people feel unique.

And empower them to achieve their unique goals.

Part of human nature is the need to feel special

and to be acknowledged as an individual.

During many interviews, people describe their need

to feel unique throughout their technological interactions.

Feeling too predictable makes them feel uncomfortable.

Not only do people want to feel unique

when they interact with AI, but they

want AI to empower them to achieve their goals,

especially if those goals seem out

of reach, too burdensome, or complex.

My research shows that uses of AI that make

people feel connected with each other

as well as cared for make them feel acknowledged, unique,

stronger, and empowered to achieve personal goals that

are meaningful to them.

It is well known in literature that social connectedness has

a positive impact on people's wellness

and ability to proactively engage

with the world around them.

It means knowing that what you need, want, and do matters

and makes a difference.

Aligned with that, AI systems that focus on companionship,

social connectedness, and mediated social interactions

all offer great design and development opportunities.

This means focusing on applications that connect

their users instead of isolating them.

It means offering systems that focus

on those that have less access to social connectedness.

Imagine the opportunities of designing smart home systems

to provide social connectedness to senior citizens.

It also means developing services and systems

that focus on true personalization,

catering to the specific needs of specific individuals,

versus offering one-size-fits-all solutions that

were designed for a typology of users instead of a real person.

Imagine designing applications that leverage voice

based systems to offer companionship

to those that, due to reduced mobility,

have fewer opportunities to directly interact

with the world around them.

Imagine services leveraging audiovisual equipment

to enable someone to have enriched

social interactions, despite physical or economic

limitations.

Imagine an intelligent system that

remembers what you want to achieve,

and that contextually prompts and encourages you,

suggesting the best way forward, given time, situations,

and even the emotional state you're in.
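To make that kind of goal-remembering, context-aware assistant slightly more concrete, here is a minimal sketch; the goal list, context fields, and nudge wording are purely hypothetical and are not part of any system described in the video.

```python
from dataclasses import dataclass

@dataclass
class Goal:
    name: str           # e.g. "practice guitar" (hypothetical example)
    best_mood: str      # mood in which a nudge is likely to be welcome
    minutes_needed: int

def pick_nudge(goals, free_minutes, mood):
    """Return an encouraging prompt for the first goal that fits the
    user's available time and current emotional state."""
    for goal in goals:
        if goal.minutes_needed <= free_minutes and goal.best_mood == mood:
            return f"You have {free_minutes} minutes free - a good moment to {goal.name}."
    return None  # stay silent rather than nag at a bad moment

goals = [Goal("practice guitar", "relaxed", 20), Goal("call grandma", "social", 10)]
print(pick_nudge(goals, free_minutes=25, mood="relaxed"))
```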

Thanks for watching.

Don't forget to like this video and subscribe.

I will see you next week on Tuesday for more AI:UX.

For more information >> AI UX | Empower People to Achieve Unique Goals | Intel Software - Duration: 3:02.

-------------------------------------------

KRONE visits SCS Software - Euro Truck Simulator 2 (EN) | KRONE TV - Duration: 3:43.

Hello everyone and welcome to SCS Software.

Today, we will do a small walk-around at our office

and we will look at what's going on behind the scenes of our work.

Most especially, we will take a look at the current phase of the KRONE DLC project

and what's happening with the photos we took at KRONE.

So, let's go.

This is the department where everything you can see in our games is being made.

That means all traffic signs, all buildings, the look of all the roads, and everything else.

So, this is the department of our map designers

and that means that our in-game worlds are being created here.

As you can see

my colleagues are constantly working on how to improve the little details

and the realism of our game.

So, this is our programming department.

It is here where all the magic happens.

As you can see, our programmers here are working on things like

the physics of the trucks, the behaviour of the AI or the police cars,

weather, lighting effects, shadows, and nearly everything you can see in our game.

So, this is our audio room

where our AFX guy is creating

all the sounds you can hear in our games

like for example

the sounds of the engine, the sounds of the brakes, or the sounds of the AI vehicles

that are moving around

when you are approaching them in the game.

This is our special room, where as you can see

our testers are trying all the suspensions and physics of the trucks

which are implemented in our game.

So, this is the department of the vehicle artists

and Hynek will tell you more about the current progress on the KRONE DLC.

Hi, welcome to the room of the vehicle department.

Here, you can see that work has already started on the KRONE trailers.

Thanks to the pictures we were able to capture at KRONE,

we now have the most detailed references for our trailers.

So, we put the pictures we have taken

into photogrammetry software,

which helps our artists to actually reconstruct the trailer

in the best detail possible.

What photogrammetry actually does for us

is reconstruct a 3D model from the pictures,

but that model is too complex for the game.

So, we generally just use it to align the cameras,

and the artists then go over the reconstructed 3D model;

with the aligned cameras and the aligned pictures,

they are able to create the trailer as accurately as possible.
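To illustrate what aligned cameras buy the artists, here is a minimal sketch assuming a standard pinhole camera model; the matrices and the point below are made-up placeholders, not data from the KRONE shoot. Once photogrammetry has recovered each photo's camera pose and intrinsics, any point on the reconstructed model can be projected back into that photo and used as a measured reference.

```python
import numpy as np

def project_point(K, R, t, X):
    """Project a 3D world point X into pixel coordinates for a camera
    with intrinsics K and pose (R, t), using the pinhole model x ~ K (R X + t)."""
    x_cam = R @ X + t            # world -> camera coordinates
    x_img = K @ x_cam            # camera -> homogeneous image coordinates
    return x_img[:2] / x_img[2]  # perspective divide -> (u, v) pixels

# Placeholder values for illustration only.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0,    0.0,   1.0]])   # focal length and principal point
R = np.eye(3)                          # camera orientation from alignment
t = np.array([0.0, 0.0, 5.0])          # camera position from alignment
X = np.array([0.5, -0.2, 0.0])         # a corner point on the trailer mesh

print(project_point(K, R, t, X))       # where that corner lands in the photo
```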

So, that was a small look behind the scenes of SCS Software

and how things are being done here in our office.

We will continue working on the KRONE DLC

so stay tuned,

because we are also planning something really special in cooperation with KRONE.

See you.

For more information >> KRONE visits SCS Software - Euro Truck Simulator 2 (EN) | KRONE TV - Duration: 3:43.

-------------------------------------------

KRONE zu Besuch bei SCS Software - Euro Truck Simulator 2 (DE) | KRONE TV - Duration: 3:27.

For more information >> KRONE zu Besuch bei SCS Software - Euro Truck Simulator 2 (DE) | KRONE TV - Duration: 3:27.

-------------------------------------------

Intel® AI Interplanetary Challenge Week 2 | AI & Asteroid Detection | Intel Software - Duration: 3:09.

Hello, fellow space and science fans.

I'm Bill Nye, CEO of the Planetary Society.

And I'm Robert Picardo.

I'm not a doctor or AI, but I've played both on TV.

So Robert, let's talk about asteroids.

Bill, I thought you'd never ask.

Our solar system is full of stuff.

It's not just planets and moons; there are rocks, dust,

asteroids, and comets.

Yes, there are tons of these objects in our solar system.

But space is vast and it is highly unlikely,

on any given day, that a large object would strike earth.

But it is possible, and it has happened in the past.

Oh, yes.

So small objects hit the earth all the time,

and they just burn up in our atmosphere.

It's beautiful.

In fact, you may have made a wish on one

some time in the past.

But you probably wouldn't want to make

a wish on one of the big ones.

Well, you know, I wish that wasn't about to hit us.

That might have been what the ancient dinosaurs were

thinking.

I mean, it's hard to say for sure.

But I do know that another large collision in the near future

is unlikely.

But it's hardly impossible.

That is why space agencies around the world

keep an eye out for asteroids that come near our planet.

If an asteroid is on a collision course with us,

these agencies can try to nudge the asteroid off its course

with a variety of different methods.

Or shatter it with big space explosions.

Or they could even send a robotic probe

to tractor it off course.

Number one, engage tractor beam.

I'm kind of good at that.

You might be wondering, what does

all this have to do with artificial intelligence?

We humans can train machines not to replace the scientists,

but to help them go through the mountains of data

coming in from all of our observations.

Machine learning can help scientists flag potentially

hazardous asteroids, making the process of identifying

and categorizing these objects far more accurate and far less

expensive.

AI and machine learning will help our planetary defenders

be more efficient and accurate.

I love it.
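As a rough, hedged sketch of the kind of flagging just described, the toy example below trains a classifier on two orbital features (minimum orbit intersection distance and absolute magnitude); the training data is randomly generated for illustration and this is not the Planetary Society's or Intel's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic training set (illustration only): each asteroid is described by
# its minimum orbit intersection distance (AU) and absolute magnitude H.
moid = rng.uniform(0.0, 0.5, size=500)
h_mag = rng.uniform(14.0, 28.0, size=500)
X = np.column_stack([moid, h_mag])

# Toy labels loosely following the usual "potentially hazardous" rule:
# close-approaching (MOID < 0.05 AU) and large enough (H < 22).
y = ((moid < 0.05) & (h_mag < 22.0)).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Flag a couple of hypothetical new observations for human review.
candidates = np.array([[0.02, 19.0],   # close and large -> likely flagged
                       [0.30, 26.0]])  # distant and small -> likely not
print(clf.predict(candidates))
```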

So if you want to learn more about AI in space,

the Planetary Society always has exciting projects in the works.

Take it from me.

And the Intel AI Academy is where

you can find learning materials, get access to compute space,

and connect with a community of AI students and experts.

That's it for now.

Come back next week.

And until next time, keep looking up.

Especially if you're an asteroid hunter.

Do you see anything, Bill?

I don't know, but keep looking.

For more information >> Intel® AI Interplanetary Challenge Week 2 | AI & Asteroid Detection | Intel Software - Duration: 3:09.

-------------------------------------------

F-35: LE INTEGRAZIONI SOFTWARE AL 2023 - Duration: 5:53.

For more information >> F-35: LE INTEGRAZIONI SOFTWARE AL 2023 - Duration: 5:53.

-------------------------------------------

F-35: SOFTWARE 2B, I MARINE LO FARANNO VOLARE IN COPPIA - Duration: 9:02.

For more information >> F-35: SOFTWARE 2B, I MARINE LO FARANNO VOLARE IN COPPIA - Duration: 9:02.

-------------------------------------------

software developer - Duration: 1:30.

For more information >> software developer - Duration: 1:30.

-------------------------------------------

WebVR | Introducing VR Concepts | Intel Software - Duration: 2:16.

In this episode, we'll talk about various VR concepts.

I'm Alexis Menard, and I will introduce the basics to create

the illusion of reality.

This is WebVR.

[MUSIC PLAYING]

To create an immersive, yet performant VR experience,

you need to understand how VR works.

Virtual reality experiences start with simulating the world

in front of your eyes.

It's done by using a stereoscopic display in an HMD.

A VR headset is built with two lenses together with one

or two displays inside.

In each display, we show slightly different angles

of the scene to each eye, simulating depth.

This means that a VR application has to render two frames--

one for each eye with slightly different parameters.

These parameters, called user pose,

are calculated using the sensors that

are located inside the HMD.

If your system supports room-scale tracking,

you will also get the position in the pose data.
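As a rough sketch of "two frames with slightly different parameters", the snippet below derives a left-eye and right-eye position by offsetting a single head pose by half the interpupillary distance; the pose values and the 64 mm IPD are illustrative placeholders, and a real WebVR application would normally read per-eye view and projection matrices from the VRFrameData the runtime provides.

```python
import numpy as np

def eye_positions(head_position, head_right_axis, ipd_m=0.064):
    """Split one tracked head position into left/right eye positions by
    shifting half the interpupillary distance along the head's right axis."""
    half = 0.5 * ipd_m * head_right_axis
    return head_position - half, head_position + half

head_pos = np.array([0.0, 1.6, 0.0])    # placeholder pose from HMD sensors (metres)
right_axis = np.array([1.0, 0.0, 0.0])  # placeholder head orientation (facing -Z)

left_eye, right_eye = eye_positions(head_pos, right_axis)
# The application then renders the same scene twice, once from each eye position,
# each with its own projection matrix supplied by the VR runtime.
print(left_eye, right_eye)
```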

The final visual effect is applied

with the lenses, which create a stereoscopic 3D

image by angling the two 2D images to mimic

each of your eyes.

A side effect of the lenses is the distortion

they apply to the content.

Think of a magnifying glass.

The middle content is sharp, but the edges are distorted.

A VR runtime has to take into account this problem

and apply post-processing on each frame.

The VR system has to neutralize the distortion

by applying the opposite effect of the lenses.

This is called barrel distortion.

If you have ever wondered why VR images look

distorted on your smartphone before you put it

in your HMD, this is the reason.
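A minimal sketch of that pre-distortion step, assuming a simple radial (Brown-style) model: a negative distortion coefficient pulls points toward the centre (barrel), which the pincushion distortion of the lens then cancels out. The coefficient below is an arbitrary placeholder, not a value from any particular headset.

```python
def barrel_distort(u, v, k1=-0.25):
    """Apply simple radial distortion to normalised image coordinates
    (centre at 0,0, edges at +/-1). Negative k1 compresses the edges (barrel)."""
    r2 = u * u + v * v
    scale = 1.0 + k1 * r2
    return u * scale, v * scale

# A point at the centre is untouched; a point near the corner is pulled inward,
# pre-compensating for the lens stretching the edges back out.
print(barrel_distort(0.0, 0.0))   # (0.0, 0.0)
print(barrel_distort(0.9, 0.9))   # roughly (0.54, 0.54)
```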

The final critical part of VR is latency.

If the user perceives dropped frames or slowness

in refreshing the content, it will most likely

lead to motion sickness and provide a poor user experience.

If a VR application doesn't meet its frame budget-- for example,

11 milliseconds in a 90 hertz VR display--

the VR runtime will trigger what we call asynchronous timewarp

or reprojection, basically showing

the last rendered frame reprojected with the latest

user pose.

Timewarp has drawbacks such as jitter.

And as a developer, you should always

try to meet your frame budget.
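To make the frame-budget arithmetic concrete, here is a minimal sketch; the 90 Hz refresh rate and roughly 11 ms budget come from the episode, while the per-frame render times are invented examples.

```python
def frame_budget_ms(refresh_hz):
    """Time available to render one frame: 1000 ms divided by the refresh rate."""
    return 1000.0 / refresh_hz

def handle_frame(render_time_ms, refresh_hz=90.0):
    """Decide whether the new frame made it in time or the runtime must fall back
    to reprojecting (asynchronous timewarp) the previous frame with the latest pose."""
    budget = frame_budget_ms(refresh_hz)   # ~11.1 ms at 90 Hz
    if render_time_ms <= budget:
        return "present new frame"
    return "reproject last frame with latest pose"

print(frame_budget_ms(90.0))   # ~11.1
print(handle_frame(9.5))       # within budget
print(handle_frame(14.0))      # missed budget -> timewarp/reprojection
```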

Thanks for watching and subscribing

to the Intel Software channel.

We'll see you next week for another episode of WebVR.

[INTEL THEME]
