Thursday, December 27, 2018

Auto news on YouTube Dec 27 2018

So, we have had SysAid implemented at North York General Hospital for about five years.

And, yes, in the beginning we only had it in IT. But other departments

started looking at our solution because our KPIs are so great, and every time

I present our KPIs, the data is great, the information is great.

So the other departments, including facility management

and biomed management, started approaching our CIO, asking, "Okay, can we

use SysAid as well?" So, yeah, so far we have already implemented it in other departments too.

♪ [music] ♪

For more information >> How SysAid Service Desk Software Transforms Non-IT Departments - Duration: 0:45.

-------------------------------------------

Windows 10 VPN tutorial for opening sites blocked by the government, without extra software, using an IP from another country - Duration: 8:22.

-------------------------------------------

Knowledge Distillation with Keras* | AI News | Intel Software - Duration: 1:21.

[MUSIC PLAYING]

I'm David Shaw.

In this episode of AI News we look

at the concept of knowledge distillation and the ways

it can improve deep learning model performance

on mobile devices.

Knowledge distillation is a process

where a large and complex network

is trained, extracting the important features

from the given data

so that it can produce better predictions.

Let's suppose we train a small network

with the help of a complex model.

This network will be able to produce comparable results

and, in some cases, even replicate

the results of the more cumbersome network.

For example, GoogLeNet is complex.

Its depth gives it the ability to extract features,

and it has the power to remain accurate, but at a cost.

The model is heavy and needs huge amounts

of memory and a powerful GPU to perform large calculations.

That's why we need to transfer the knowledge learned

by this model to a smaller model that can be easily used

on a mobile device.
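
The video points to an article for details, but the core training loop is short. Below is a minimal Keras sketch of soft-target distillation in the style of Hinton et al.; the teacher file name, the student's layer sizes, and the temperature and alpha values are illustrative assumptions, not taken from the video.

```python
import tensorflow as tf
from tensorflow import keras

# Hypothetical pretrained teacher that outputs raw logits (assumed file name).
teacher = keras.models.load_model("teacher.h5")
teacher.trainable = False

# Small student network; layer sizes are illustrative (e.g., for MNIST-like input).
student = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10),  # logits, no softmax
])

temperature = 5.0  # softens the teacher's output distribution
alpha = 0.1        # weight on the hard-label loss (assumed value)

optimizer = keras.optimizers.Adam()
kld = keras.losses.KLDivergence()
ce = keras.losses.SparseCategoricalCrossentropy(from_logits=True)

@tf.function
def train_step(x, y):
    # Teacher's softened probabilities serve as the "soft targets".
    teacher_probs = tf.nn.softmax(teacher(x, training=False) / temperature)
    with tf.GradientTape() as tape:
        student_logits = student(x, training=True)
        student_probs = tf.nn.softmax(student_logits / temperature)
        # Soft loss: match the teacher's softened distribution
        # (scaled by T^2 to keep gradient magnitudes comparable).
        soft_loss = kld(teacher_probs, student_probs) * temperature**2
        # Hard loss: ordinary cross-entropy on the true labels.
        hard_loss = ce(y, student_logits)
        loss = alpha * hard_loss + (1.0 - alpha) * soft_loss
    grads = tape.gradient(loss, student.trainable_variables)
    optimizer.apply_gradients(zip(grads, student.trainable_variables))
    return loss
```

Calling train_step on batches of images and integer labels trains the student to match the teacher's softened outputs while still fitting the true labels, which is what lets the small model approach the large one's accuracy at a fraction of its size.
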

Read this article to learn more about how bulky models can

be used to create lighter models by using knowledge

distillation.

Thanks for watching.

Don't forget to view the links and learn more.

And I will see you next week.

[MUSIC PLAYING]
