Today at WWDC, Apple brought machine learning to Photos to help you find, discover and share your images more intuitively than ever before. The features borrow some of the best ideas from Google Photos, like re-surfacing memorable events, creating albums based on events, people and places, and using deep learning to make images easier to find. The new algorithm uses advanced computer vision, a set of deep learning techniques that bring facial recognition to the iPhone. Now, you can find all of the most important people, places and things in your life with…

This story continues at The Next Web