33 Comments

  • PeachNCream - Wednesday, December 11, 2019 - link

    This seems like something Google would love to sell you in order to measure your bust size, use that as the basis for spamming you with enhancement or reduction adverts, and violate your privacy on an ongoing basis, all while suckering you into thinking you're getting some actual benefit from it.
  • Eliadbu - Wednesday, December 11, 2019 - link

    In other words: mining data about you and your apartment, then using it for advertising or, even worse, selling it to a 3rd party.
  • Amandtec - Wednesday, December 11, 2019 - link

    Google doesn't sell your data. You are thinking of Facebook.
  • GreenReaper - Wednesday, December 11, 2019 - link

    No, they sell access to match against it. So they effectively *rent* your data - to the highest bidder.
  • dullard - Wednesday, December 11, 2019 - link

    You aren't that far off: https://www.intelrealsense.com/naked-home-body-sca...
  • mode_13h - Thursday, December 12, 2019 - link

    That's surprising. I didn't want to click your link, so I searched it out and it appears to be legit.
  • mode_13h - Thursday, December 12, 2019 - link

    It's funny you say that, because Google actually had a phone platform that included a depth sensor, which they killed about 1.5 years ago.

    So, no. Google clearly does not care about accurate depth maps. If they want to estimate your bust size, they'll just have their deep learning algorithms look you over with your RGB camera.
  • Ikefu - Wednesday, December 11, 2019 - link

    I wonder how this would do as a replacement for Xbox Kinect in hobby and academic projects now that the consumer grade Kinect is defunct.
  • mode_13h - Thursday, December 12, 2019 - link

    Not defunct: https://www.microsoft.com/en-us/p/azure-kinect-dk/...

    The sample images I've seen from it look very impressive. Plus, it supports sensor teaming for volumetric captures.
  • imaheadcase - Wednesday, December 11, 2019 - link

    For a home user, what would this be practical for?
  • qap - Wednesday, December 11, 2019 - link

    Lidars are already used in robotic vacuum cleaners, but they usually produce only a 2D map, which limits what they can recognize. This could greatly improve avoiding areas where the vacuum can get stuck (mine gets stuck on a clothes-drying rack), it could avoid cleaning up dog feces (that needs object recognition), and it could handle other hazardous areas (it can replace cliff sensors, which have problems with black carpets).
    Obviously not for 350 USD. When it gets down to 35 USD, we can talk.
  • imaheadcase - Wednesday, December 11, 2019 - link

    So nothing for average home user.
  • Morawka - Wednesday, December 11, 2019 - link

    This would be useful as a 3D scanner when combined with a lazy susan. I can also see this being used for interior decorating planning.
  • michael2k - Wednesday, December 11, 2019 - link

    You're weirdly obtuse. That's like asking, regarding a new CPU/GPU product announcement, "For a home user, what would this be practical for?"

    After all, the average home user has no need to add or multiply billions of numbers per second, which is what a CPU or GPU fundamentally does.

    Nor, if we talk about SSDs, do they need to write, read, and erase 1s and 0s billions of times per second either, and that's all an SSD does too.

    But if you talk about the software that uses the CPU, GPU, or SSD, well, then you have the ability to start doing home videos, playing games, writing reports, etc.

    So the same with a small, compact, cheap lidar unit. Assume it continues to get miniaturized until it is part of your phone, the same way a camera is. Did you know cameras used to be the size of a shoebox? Then the size of a deck of cards? Now cameras have shrunk to the size of a golf ball, or smaller.

    So if we imagine this same 3D scanner and its use cases:
    1) Advanced home security detection of people, animals, and vehicles, as well as robots
    2) Robots will use this to perform collision avoidance as well as navigating spaces
    3) Object detection, when a robot needs to identify and select between things (think of a robot that can see the difference between an apple and an orange by color as well as shape)
    4) Video games and VR, without needing extra sensors
    5) Projectors with advanced 3D sensing can manipulate the image prior to projection to account for surface irregularities; in other words, anything can become a screen
    6) AR with 3D data can turn anything into a virtual screen, too
    7) Refrigerators can scan their contents to identify food as it wilts, decomposes, dries out, or melts, with much higher fidelity than a camera can using only pixels

    I mean, this is pretty cool.
  • name99 - Wednesday, December 11, 2019 - link

    That’s a little harsh.
    The alternative to LIDAR is using smarts to analyze a visual scene. That relies on a more expensive chip, sure, but a cheaper sensor. And at least some people (Tesla...) believe vision is a “better” solution than LIDAR.

    So the issue is not just “what would I do with 3D vision”, it’s also “do I need to pay $350 to get 3D rather than using a smart chip/software?”
  • mode_13h - Thursday, December 12, 2019 - link

    I wonder how many autopilot-related deaths & lawsuits it'll take to disabuse Tesla of the absurd notion that you can afford to use vision, alone.
  • close - Wednesday, December 11, 2019 - link

    @michael2k, you're talking about future possible applications of LIDAR. I think the question was whether a regular user can do anything practical today with this particular $350 piece of equipment.
  • khanikun - Thursday, December 12, 2019 - link

    Everything on his list can be done today, so long as whoever has it knows how to program for it. If you're asking what some computer-illiterate person can use it for... well, nothing.
  • close - Thursday, December 12, 2019 - link

    "what can some computer illiterate person"

    Yes, that's basically what OP was asking. If something exists, "someone" will be able to do something practical with it (even if it's just porn). When a regular user sees this Intel puck, they may think "maybe I can plug it in, install the bundled software, and Windows will scan my face and log me in" or something like that.
  • mode_13h - Thursday, December 12, 2019 - link

    Probably not "average" home users, no. It's not marketed towards them, either.

    This thing is mainly for researchers, tinkerers, and people prototyping & developing products that would integrate its sensor(s).
  • Amandtec - Wednesday, December 11, 2019 - link

    Self Driving Cars made of Lego
  • Amandtec - Wednesday, December 11, 2019 - link

    Indoor != Home.
  • mode_13h - Thursday, December 12, 2019 - link

    Robotics & 3D scanning are two obvious uses. Perhaps virtual sets, for video production.

    Here are some more examples: https://www.intelrealsense.com/use-cases/
  • Thud2 - Wednesday, December 11, 2019 - link

    Sorry if these are stupid questions, but I know from years of reading AT that there are many, many more knowledgeable people here than I. Can this camera stream h264? That image is much more defined than what thermal cameras ten times the price can stream. Thermal cameras can struggle to define humans at ambient temps around 98.6°F, and optical security cameras are confused by fog, shadows, and close insects. This seems to be an ideal security camera (aside from the USB bandwidth issue). Can this operate through glass or plastic for an enclosure? One more question :) Could this camera be programmed to only pass images of objects further away than an insect at 1 or 2 feet? Thanks in advance. End of novel.
  • mode_13h - Thursday, December 12, 2019 - link

    It utilizes a class 1 laser @ 860 nm. So, if whatever you want to shine it through is transparent at that wavelength, you're good.

    Beware that it will likely have difficulty with glossy or reflective surfaces, though. So, it won't be very good for detecting people wearing tinfoil hats.
    ;)

    They don't say what it can stream. Its connector is "USB-C 3.1 Gen 1". You might just get a raw stream and have to encode it yourself. Given that the RGB is limited to 2 MP @ 30 Hz and depth is limited to 0.7 MP @ 30 Hz, I think the USB interface won't be a bottleneck.
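
    A rough back-of-envelope, assuming 1920x1080 YUY2 at 2 bytes per pixel for the RGB stream and 1024x768 (XGA) 16-bit depth:

        rgb   = 1920 * 1080 * 2 * 30   # ~124 MB/s
        depth = 1024 * 768  * 2 * 30   # ~47 MB/s
        total = (rgb + depth) / 1e6    # ~172 MB/s, well under the ~400 MB/s usable on 5 Gbps USB 3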
  • Thud2 - Thursday, December 12, 2019 - link

    I really appreciate the responses. Disappointing to see it's a "simulated" DOF image in the release. Will have to keep an eye out for research.
  • mode_13h - Saturday, December 14, 2019 - link

    There's a PDF datasheet linked from their product page. It says the RGB image format is YUY2, which is a raw YUV format.

    Incidentally, the depth measurements seem to be 16-bit unsigned ints.
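
    If anyone wants to poke at it, here's a minimal sketch using the RealSense SDK's Python wrapper (pyrealsense2); the exact stream modes are my assumptions based on the datasheet:

        import numpy as np
        import pyrealsense2 as rs

        pipe, cfg = rs.pipeline(), rs.config()
        # assumed modes: XGA depth and 1080p YUY2 color, both at 30 Hz
        cfg.enable_stream(rs.stream.depth, 1024, 768, rs.format.z16, 30)
        cfg.enable_stream(rs.stream.color, 1920, 1080, rs.format.yuyv, 30)
        profile = pipe.start(cfg)
        # raw depth values are uint16; multiply by the depth scale to get meters
        scale = profile.get_device().first_depth_sensor().get_depth_scale()
        try:
            frames = pipe.wait_for_frames()
            depth = np.asanyarray(frames.get_depth_frame().get_data())
            print(depth.dtype, depth.shape, depth.max() * scale)
        finally:
            pipe.stop()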
  • edzieba - Thursday, December 12, 2019 - link

    If that's a 'legit' captured depth map (rather than fusion of multiple captures over time into a synthesised scene) then holy crap, that's an extremely clean sensor output for that price. I might have to grab one to play with.
  • stepz - Thursday, December 12, 2019 - link

    It's not: "Simulated example of depth output from device"
  • Thud2 - Thursday, December 12, 2019 - link

    I just watched the release video, and it says the lidar is XGA. Better than cheap thermal IP cameras, but not like the image in the release at all.
  • mode_13h - Saturday, December 14, 2019 - link

    I was wondering what the heck XGA is, but I guess it's just referring to the resolution (1024x768).
  • mode_13h - Saturday, December 14, 2019 - link

    They claim to have two actual depth images on this page:

    https://www.intelrealsense.com/lidar-camera-l515/
  • edzieba - Monday, December 16, 2019 - link

    Heavy on the edge-detect, but still rather good. I wonder if the MEMS LIDAR chip is from Draper or Aeva, or a surprise Intel-developed chip?
