Schneier on Security, 16.06.2020 15:20
New research is able to recover sound waves in a room by observing minute changes in the room's light bulbs. This technique works from a distance, even from a building across the street through a window.
Details:
In an experiment using three different telescopes with different lens diameters, from a distance of 25 meters (a little over 82 feet), the researchers were able to capture sound being played in a remote room, including The Beatles' "Let It Be", which was distinguishable enough for Shazam to recognize it, and a speech from President Trump that Google's speech recognition API could successfully transcribe. With more powerful telescopes and a more sensitive analog-to-digital converter, the researchers believe the eavesdropping distances could be even greater.
It's not expensive: less than $1,000 worth of equipment is required. And unlike other techniques like bouncing a laser off the window and measuring the vibrations, it's completely passive.
https://www.schneier.com/blog/archives/2020/06/eavesdropping_o_9.html
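The recovery step itself is mostly ordinary signal processing: the bulb's surface vibrates with the sound in the room, the telescope and photodiode turn that into a brightness signal, and you filter and normalize it back into audio. Here is a minimal sketch of that last step, assuming the photodiode samples are already in a NumPy array (the array contents and sample rate here are made up for illustration):

```python
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.io import wavfile

def brightness_to_audio(samples, sample_rate=4000, out_path="recovered.wav"):
    """Turn a raw light-intensity signal into a rough audio estimate."""
    # Remove the DC component (the bulb's constant brightness).
    centered = samples - np.mean(samples)

    # Keep only the speech/music band; the rest is mostly flicker and noise.
    nyq = sample_rate / 2.0
    b, a = butter(4, [80 / nyq, 1800 / nyq], btype="band")
    filtered = filtfilt(b, a, centered)

    # Normalize to 16-bit PCM and write a WAV file that Shazam-style tools can read.
    scaled = np.int16(filtered / np.max(np.abs(filtered)) * 32767)
    wavfile.write(out_path, sample_rate, scaled)

# Example: a fake 2-second signal standing in for real photodiode data.
fake = np.random.randn(8000)
brightness_to_audio(fake)
```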
Yeah, I read this article. It's a very interesting technique, but it needs very specific conditions.
For those who don't know, "My Tracks" allows you to create tracks with the GPS, showing various info: average speed, average moving speed, height differences, etc.
Over the last several weeks I have found it useful for tracking my snowboarding sessions: where I am riding, how fast, etc. The problem is a really strange one: in 8 out of 10 cases, after I start recording a new track, it turns out I have a top speed of 460 km/h. I ride really fast, yeah, but that fast?
I was wondering what the cause could be. For obvious reasons I can't observe the program while riding: one fall and the phone is gone.
When I am on the lift I can observe it, and everything is normal. I suspected that falls could cause this, but today I had several quite bad falls during one recording, and yet it was accurate.
Has anyone observed such behavior?
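Spikes like 460 km/h are usually a single bad GPS fix right after the receiver gets a lock, which makes two consecutive track points look impossibly far apart. If it helps, here is a rough sketch of how such outliers can be filtered out of an exported track afterwards; it assumes simple (lat, lon, unix time) tuples and a made-up speed cutoff, nothing built into My Tracks:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two fixes in meters."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def drop_speed_spikes(points, max_kmh=120.0):
    """Drop fixes that imply an implausible speed relative to the previous kept fix."""
    kept = [points[0]]
    for lat, lon, t in points[1:]:
        plat, plon, pt = kept[-1]
        dt = t - pt
        if dt <= 0:
            continue
        speed_kmh = haversine_m(plat, plon, lat, lon) / dt * 3.6
        if speed_kmh <= max_kmh:
            kept.append((lat, lon, t))
    return kept

# Example: the middle fix jumps ~13 km in one second and gets discarded.
track = [(42.57, 23.28, 0), (42.69, 23.30, 1), (42.5702, 23.2801, 2)]
print(drop_speed_spikes(track))
```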
Try CardioTrainer, I like it better than MyTracks.
Hi, I have very bad GPS accuracy. It is worse when I am in a car or bus; then the accuracy is about twice as bad. My track looks like a zigzag even when I am going straight. My other, older phone does much better. Is there an antenna issue, or is this GPS just sh*t?
Also, the compass is decalibrated all the time. On Google Maps it shows a wide beam instead of a narrow one like my old phone, almost a 180-degree spread, and it is not very accurate either. For car navigation the accuracy is OK and doesn't jump much, but when walking or recording a trace it is a big problem; for example, a running app will add extra distance, and summed up it becomes significant. Is my unit just bad or what? Can it be improved? Is it a hardware or software related thing?
On my other phone, once the accuracy is dialed in it stays that way; here it fluctuates from good to worse and sometimes even very bad (like 100 m).
You're not alone. I'm experiencing poor GPS accuracy as well, but it's hard to tell whether it's software or hardware related.
I barely get below 10 m of accuracy when I test the GPS using the "GPS Test" app, with a clear view of the sky and ~15 satellites in use.
Same here
Actually, GPS and compass are pretty accurate compared to my Nexus 6. Check out the screenshot; it was taken in my living room in a 2-story house, with cloudy weather and slight rain.
And what happens outside? Does it manage to retain high accuracy (up to 5 meters/16 feet)?
I'm getting quick fixes and high accuracy (15 ft), but apps like Locus and Ingress are very drifty. It's weird.
Haven't bought Philips Hue yet, but I'm planning on switching my whole lighting system to Philips Hue. I own a Google Home Mini and a Lenovo Smart Display with Google Assistant. Both are connected, but will I be able to turn on the lights with both assistants? I use the Lenovo Smart Display as the main console in my living room, because the screen gives so many more features than the Home Mini. The camera shutter feature is also life-changing; it makes me feel safer and less paranoid, lol. Has anyone tried controlling Hue from both?
Both are using Google Assistant, which is mainly web based, so both should work with your Hue lights once they have been paired. I have Hue with my Google Home Mini and they work like a charm.
I'd actually recommend against doing whole house with Hue bulbs. I have them, and after doing a whole house conversion I think that getting smart switches is a much better way to go.
edembowski said:
I'd actually recommend against doing whole house with Hue bulbs. I have them, and after doing a whole house conversion I think that getting smart switches is a much better way to go.
I see applications for both, and I do have both. I splurged on a Leviton Z-Wave dimmer switch for a couple of lights because the Hue bulb price for that type of light was a little too rich for my blood. However, the individual bulbs do give me more granular control. For instance, I got a Hue motion sensor and hooked it up to two different lights in two different rooms so they act as night lights. There are uses for both; I think it just depends on what you want and where the lights are, imo.
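For anyone wiring this up themselves: both Assistant devices end up talking to the same Hue bridge, and the bridge also exposes a simple local REST API you can call directly. A rough sketch, with the bridge IP and API username as placeholders you would replace with your own:

```python
import requests

BRIDGE_IP = "192.168.1.2"        # placeholder: your Hue bridge's address
USERNAME = "your-api-username"   # placeholder: created after pressing the bridge's link button

def set_light(light_id, on, brightness=254):
    """Turn a Hue light on/off and set brightness (1-254) via the bridge's local API."""
    url = f"http://{BRIDGE_IP}/api/{USERNAME}/lights/{light_id}/state"
    body = {"on": on}
    if on:
        body["bri"] = brightness
    resp = requests.put(url, json=body, timeout=5)
    return resp.json()

# Example: dim light 1 to roughly 20% as a night light.
print(set_light(1, True, brightness=50))
```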
When it comes to smartphone photography, the most challenging shots are always going to be night shots. Situations with limited light most often result in grainy, unusable photos on devices with weaker cameras. The Kirin 970's AI chip helps to solve this issue with "Handheld Super Night Mode".
One way to achieve better night shots is to set your phone on a tripod and let your camera use a longer exposure and higher ISO. This is a bit inconvenient, as most people obviously won't be walking around with tripods. To solve this issue, Honor uses the Kirin 970 to add "Handheld Super Night Mode" to their phones. This mode lets you take better night shots without having to set up any equipment.
Handheld Super Night Mode works by using powerful AI algorithms and the quick processing ability of the Kirin 970. Several techniques are used to enhance your nighttime photos.
AI Detection of Handheld State
One of the key factors of Handheld Super Night Mode is how the phone uses the AI chipset to detect hand-held jitter. To achieve accurate and efficient detection, the AI system collected and analyzed tens of thousands of data records reflecting different types of photographers and their camera and tripod usage, with machine learning logic designed to understand their habits. Thanks to this massive amount of data, the Kirin 970 is able to detect when Handheld Super Night Mode is needed within 0.2 seconds, and the average user is now able to take better night shots without having to use a tripod.
AI Photometric Measurement
The AI photometric measurement system controls the camera's light intake. After you tap the shutter button, the AI will automatically set the exposure and number of frames based on the lighting scenario, the brightness of the preview image, the distribution of light sources, and jitter.
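Huawei hasn't published the exact heuristics, but the idea is easy to picture: darker scenes and steadier hands buy you longer exposures and more frames. A purely illustrative sketch, with invented thresholds:

```python
def plan_exposure(mean_brightness, jitter):
    """Pick frame count and per-frame exposure from scene brightness (0-255)
    and a hand-shake estimate (0 = tripod-steady, 1 = very shaky).
    All thresholds here are invented for illustration only."""
    if mean_brightness > 80:           # bright enough: one normal frame is fine
        return {"frames": 1, "exposure_ms": 33}
    # Darker scenes: capture more frames, but shorter ones the shakier the hand is.
    frames = 8 if jitter < 0.3 else 16
    exposure_ms = 250 if jitter < 0.3 else 80
    return {"frames": frames, "exposure_ms": exposure_ms}

print(plan_exposure(mean_brightness=25, jitter=0.5))
```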
AI Image Stabilization
After all of the frames from your night shot are captured, they are merged into a single image. During this process, night shots often turn out blurry. To avoid this, before the synthesis takes place, the AI picks the clearest frames and discards the bad ones. The clearest frames are used as the standard for the image, while the remaining frames are automatically aligned to them. The AI-powered Kirin 970 chip detects feature points within each frame, matching these points and aligning them to produce the cleanest image possible.
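Feature-based alignment of burst frames is a standard computer-vision step. The sketch below uses OpenCV's ORB features to show what "detect feature points, match them, and align the frames" can look like in practice; it illustrates the general technique, not Huawei's actual pipeline:

```python
import cv2
import numpy as np

def align_to_reference(reference, frame, max_matches=200):
    """Warp `frame` onto `reference` using ORB feature matches and a homography."""
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(reference, None)
    kp2, des2 = orb.detectAndCompute(frame, None)

    # Match descriptors and keep the strongest correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:max_matches]

    src = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC discards mismatched points, much like the AI discarding bad frames.
    h, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    height, width = reference.shape[:2]
    return cv2.warpPerspective(frame, h, (width, height))
```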
Image synthesis
The final step in Super Night Mode is image synthesis. For this step, customized algorithms let the AI system increase the number of short-exposure frames in bright areas to avoid overexposure, and the number of long-exposure frames in dark areas to improve detail retention. Frame differences are detected pixel by pixel; if the differences are large, the AI determines that alignment failed around the edges and performs correction and repair to ensure the edge regions are still crisp and sharp after synthesis. Noise reduction is then performed across multiple frames, improving the image's signal-to-noise ratio and producing a clearer, cleaner, and brighter night shot.
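The multi-frame noise reduction at the end is the most intuitive part: averaging N aligned frames suppresses independent sensor noise by roughly a factor of sqrt(N). A toy sketch of that merge, assuming the frames have already been aligned:

```python
import numpy as np

def merge_frames(frames):
    """Average a list of aligned frames (uint8 arrays) to reduce random noise.

    Noise that is independent per frame shrinks roughly by sqrt(len(frames)),
    so the merged shot is cleaner and can be brightened without looking grainy.
    Ghost handling (per-pixel difference checks) is omitted in this sketch.
    """
    stack = np.stack([f.astype(np.float32) for f in frames], axis=0)
    merged = stack.mean(axis=0)
    return np.clip(merged, 0, 255).astype(np.uint8)
```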
Check out photo samples taken with Handheld Super Night Mode here.
Maybe AI helps, but in my experience AI mode decreases image quality and sharpness and boosts the colours too much.
Most of the images captured in AI mode are not usable.
Thanks for this useful info, sir. Very well explained.
Hey guys
First of all, sorry I have to use a translator.
2 questions:
1. Does anyone still have the problem that the earpiece speaker crackles? I now have the third phone where this occurs, and at some point everyone in the conversation sounds scratchy. As a side note, all 3 phones also have the problem that the USB port starts to wobble at some point.
2. Now to the actual topic: SUPERPIXEL.
So what about the 7+? I had followed the presentations live and everything; it was mentioned and presented in a big way.
The thing is:
A. You don't notice anything about it in any preset, whether 16MP or 4MP. At most, only the 4MP shot could be a superpixel image, but then it would also have to have the same aspect ratio, i.e. 16MP 4:3 down to 4MP 4:3, yet there is only 4MP 16:9.
When you talk to Nokia about it (and I have done so often and at length), you get the most curious answers:
-SUPERPIXEL does not exist
-SUPERPIXEL has never been announced or unveiled
-SUPERPIXEL doesn't work because the SENSOR does not have super-large pixels
-16MP is the superpixel image (OK, 4 become one, so we have a sensor that natively offers four times 16MP? Good joke.)
-4MP is correct and yes, it must have the same ASPECT ratio, which it does (no, that is not true, and you don't see any sign of it either)
Even to the question of why all other apps can only recognize and use 4MP, Nokia gives no answer, even when you point out the false SUPERPIXEL answers.
And by "no answer" I mean that as soon as you draw attention to the wrong information, the conversations are cancelled or simply ignored.
This happens via email, chat, and Facebook chat.
Even the contacts who handled the more intensive and longer conversations simply no longer answer; emails get (manual) replies stating that they are not responsible, know nothing, or cannot answer.
In the Facebook chat, where they really tried for a long time and where you also notice that deeper internal inquiries produce conflicting information, they just don't answer anymore.
Something is massively fishy here, even if it sounds funny: with this 4MP, this 16MP, and the missing superpixel, something is simply not right.
Since the 16MP image actually looks worse when zoomed in than a 4MP shot from a third-party camera app, there is a real suspicion that the 4MP image is being scaled up to 16MP (which of course does not make the picture better), instead of binning the 16MP down to 4MP by combining four pixels into one (which is what PureView or superpixel means).
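For context, the difference between the two cases is easy to demonstrate: true 2x2 binning combines four sensor pixels into one cleaner output pixel, while upscaling a 4MP image to 16MP only interpolates and adds no detail. A small sketch of the binning side, assuming a plain grayscale array with even dimensions:

```python
import numpy as np

def bin_2x2(sensor):
    """Combine each 2x2 block of sensor pixels into one output pixel
    (the 'four pixels become one' idea behind PureView/superpixel)."""
    h, w = sensor.shape
    blocks = sensor.reshape(h // 2, 2, w // 2, 2).astype(np.float32)
    return blocks.mean(axis=(1, 3))

# A 16MP-style frame (4608x3456) bins down to a 4MP-style frame (2304x1728),
# keeping the 4:3 aspect ratio the binned image should have.
frame = np.random.randint(0, 256, size=(3456, 4608), dtype=np.uint16)
print(bin_2x2(frame).shape)   # (1728, 2304)
```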
I'm slowly starting to think something really is fishy here.
I've had this problem too.
--Does anyone still have the problem that the earpiece speaker crackles? I now have the third phone where this occurs, and at some point everyone in the conversation sounds scratchy. As a side note, all 3 phones also have the problem that the USB port starts to wobble at some point.
For the phone mic being scratchy, I have a small hand air pump (bicycle pump) which I use to clean out the mic holes (2 that I can see; I heard there are 3, but I don't know where the 3rd one is). Hope this helps.