Artificial intelligence could help data centers run far more efficiently

August 21, 2019

MIT system “learns” how to optimally allocate workloads across thousands of servers to cut costs and save energy.

Source: MIT Research News
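The release describes a system that learns, from scheduling experience, how to place incoming jobs across a large server cluster so that cost and energy use go down. As a rough illustration only (not the MIT system's actual method or code), the sketch below shows the kind of placement decision such a scheduler automates; the `Server` class, the `assign` function, and the greedy utilization score are hypothetical stand-ins for a learned policy.

```python
# Illustrative sketch only -- not the MIT system's code.
# A toy scheduler assigns each incoming job to the server whose projected
# utilization (a stand-in for cost/energy) increases the least.

from dataclasses import dataclass


@dataclass
class Server:
    name: str
    capacity: float      # arbitrary compute units
    load: float = 0.0    # work currently assigned

    def projected_utilization(self, job_size: float) -> float:
        return (self.load + job_size) / self.capacity


def assign(job_size: float, servers: list[Server]) -> Server:
    """Greedy placement: pick the server whose utilization stays lowest.

    A learned scheduler would replace this hand-written scoring rule with a
    policy trained from past scheduling outcomes.
    """
    best = min(servers, key=lambda s: s.projected_utilization(job_size))
    best.load += job_size
    return best


if __name__ == "__main__":
    cluster = [Server("s1", capacity=100), Server("s2", capacity=80)]
    for size in [30, 50, 20, 40]:
        chosen = assign(size, cluster)
        print(f"job({size}) -> {chosen.name} (load now {chosen.load})")
```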
