How Does How-Old.Net Work?


How does how-old.net work? originally appeared on Quora.


Answer by Eason Wang, Senior Program Manager on Bing

I worked directly on this project. To be honest, it was a big surprise to me that this tiny web app went viral. I did a post-mortem analysis of why it went viral and wrote a blog post about it on Medium.

Back to the main topic, I want to answer this question in two parts. In the first part, I will talk about how to quickly implement the exact same capabilities in any app. In the second part, I will go a bit deeper to describe the technology itself.

In Bing Image Search, we have spent the past few years building industry-leading image understanding capabilities in collaboration with Microsoft Research. These capabilities were first used in Bing and are quickly expanding to other Microsoft products. Now they are open to all developers through Microsoft Project Oxford. To implement the same capability in an app, you simply call the web API and get all the necessary information back in JSON format. You can give it a try by uploading an image here; the data comes back in seconds, with the face coordinates, gender, and age all included.

The Face API is just one of the many features we have opened up in Project Oxford; there are many other core capabilities in the API to empower innovative scenarios. I am very excited to see this Microsoft-internal API opened to all developers, and I know it will have a profound impact on the developer world, because previously impossible scenarios are now just one simple web API call away. #HowOldRobot was just one tiny demo to show off these capabilities; it was put together by one developer from the Azure ML team in just one day.
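
To make that concrete, here is a minimal sketch in Python of what such a call could look like. The endpoint URL, subscription key, and parameter names (returnFaceLandmarks, returnFaceAttributes, Ocp-Apim-Subscription-Key) are placeholders based on the publicly documented Face detection API rather than the exact original Project Oxford setup, so treat them as assumptions; the field names in the result mirror the sample response shown below.

    # Minimal sketch: send an image URL to a face-detection endpoint and read back
    # the face rectangle, landmarks, age, and gender.
    # The endpoint and key below are placeholders, not the production values.
    import requests

    FACE_API_URL = "https://<your-region>.api.cognitive.microsoft.com/face/v1.0/detect"  # assumed endpoint
    SUBSCRIPTION_KEY = "<your-subscription-key>"

    def detect_faces(image_url):
        """Return the parsed JSON list, with one entry per detected face."""
        params = {
            "returnFaceLandmarks": "true",         # pupil, nose, mouth coordinates, etc.
            "returnFaceAttributes": "age,gender",  # the attributes #HowOldRobot displays
        }
        headers = {
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/json",
        }
        resp = requests.post(FACE_API_URL, params=params, headers=headers,
                             json={"url": image_url})
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        for face in detect_faces("https://example.com/photo.jpg"):
            # The sample response below uses "attributes"; newer versions use "faceAttributes".
            attrs = face.get("attributes") or face.get("faceAttributes", {})
            print(face["faceRectangle"], attrs.get("age"), attrs.get("gender"))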

A sample JSON response:

    [
      {
        "faceId": "5af35e84-ec20-4897-9795-8b3d4512a1f9",
        "faceRectangle": {
          "width": 60,
          "height": 60,
          "left": 276,
          "top": 43
        },
        "faceLandmarks": {
          "pupilLeft": { "x": "295.1", "y": "56.8" },
          "pupilRight": { "x": "317.9", "y": "59.6" },
          "noseTip": { "x": "311.6", "y": "74.7" },
          "mouthLeft": { "x": "291.0", "y": "86.3" },
          "mouthRight": { "x": "311.6", "y": "88.6" },
          "eyebrowLeftOuter": { "x": "281.6", "y": "50.1" },
          "eyebrowLeftInner": { "x": "304.2", "y": "51.6" },
          "eyeLeftOuter": { "x": "289.1", "y": "57.1" },
          "eyeLeftTop": { "x": "294.0", "y": "54.5" },
          "eyeLeftBottom": { "x": "293.0", "y": "61.0" },
          "eyeLeftInner": { "x": "297.8", "y": "58.7" },
          "eyebrowRightInner": { "x": "316.0", "y": "54.2" },
          "eyebrowRightOuter": { "x": "324.7", "y": "54.2" },
          "eyeRightInner": { "x": "312.9", "y": "60.9" },
          "eyeRightTop": { "x": "317.8", "y": "57.7" },
          "eyeRightBottom": { "x": "317.9", "y": "63.7" },
          "eyeRightOuter": { "x": "322.8", "y": "60.8" },
          "noseRootLeft": { "x": "304.0", "y": "60.2" },
          "noseRootRight": { "x": "312.2", "y": "61.2" },
          "noseLeftAlarTop": { "x": "302.6", "y": "70.2" },
          "noseRightAlarTop": { "x": "313.0", "y": "70.0" },
          "noseLeftAlarOutTip": { "x": "298.8", "y": "76.2" },
          "noseRightAlarOutTip": { "x": "315.2", "y": "76.6" },
          "upperLipTop": { "x": "307.3", "y": "84.0" },
          "upperLipBottom": { "x": "306.6", "y": "86.4" },
          "underLipTop": { "x": "305.5", "y": "89.6" },
          "underLipBottom": { "x": "304.1", "y": "94.0" }
        },
        "attributes": {
          "age": 24,
          "gender": "female",
          "headPose": {
            "roll": "4.0",
            "yaw": "31.3",
            "pitch": "0.0"
          }
        }
      }
    ]

How Old Do I Look? mainly relies on three key technologies: face detection, gender classification, and age detection. Face detection is the foundation for the other two. Age and gender detection are classic regression and classification problems in machine learning: they involve facial feature representation, collecting training data, building regression/classification models, and optimizing those models. There are plenty of publications in this area; let me know if you are interested in going deeper.
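
The answer stays at the conceptual level, so the following toy sketch only illustrates what "classic regression and classification" on facial features means in practice: it trains an age regressor and a gender classifier on fixed-length feature vectors using scikit-learn. The features and labels here are synthetic placeholders; a real system would use learned face representations and large labeled datasets.

    # Toy illustration (not the production system) of age regression and gender
    # classification on facial-feature vectors. All data below is synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression, Ridge
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Pretend each face is a fixed-length feature vector derived from landmarks
    # (eye distance, nose width, texture statistics, ...).
    n_faces, n_features = 1000, 32
    X = rng.normal(size=(n_faces, n_features))

    # Synthetic labels, just to make the pipeline runnable end to end.
    age = 20 + 10 * X[:, 0] + rng.normal(scale=3.0, size=n_faces)       # regression target
    gender = (X[:, 1] + 0.3 * X[:, 2] + rng.normal(size=n_faces) > 0)   # classification target

    X_train, X_test, age_train, age_test, g_train, g_test = train_test_split(
        X, age, gender, test_size=0.2, random_state=0)

    age_model = Ridge(alpha=1.0).fit(X_train, age_train)        # age: regression
    gender_model = LogisticRegression().fit(X_train, g_train)   # gender: classification

    print("age MAE:", np.mean(np.abs(age_model.predict(X_test) - age_test)))
    print("gender accuracy:", gender_model.score(X_test, g_test))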

On the other hand, deep learning and large-scale data understanding have led to new breakthroughs in image understanding. This opens the door to more intelligent systems and APIs. You can check out my latest blog post to understand how the Image Graph works to power advanced scenarios.

