Hi, I've searched many places and can't find any good guides for video conversion for the Acer Liquid (Donut), and the user manual doesn't even give full specs.
I've done many conversions of my own, but the video always lags...
Please tell me the max bitrate, best codec (MP4, H.263, etc.), resolution, and fps to make a video play in high quality with NO frame skips/lag on the Acer Liquid (Donut).
Note: I understand the max quality is only as good as the video/clip you've input, and that the Acer only supports up to 20 frames per second (at least, that's what's written in the user manual). I have a good video converter, btw; I've been using it for years...
Shalala
Alright, after playing with the video for a few more days,
I STILL don't get exactly what I wanted... and I noticed a lot of people viewing this thread, so I'll share what I have at the moment.
~~~Good Enough~~~
Right this moment, my best conversion is...
Video quality: Superb
File format: *.MP4
Video codec: MPEG-4
Resolution: same as the input; since there's no TV-out, 640x480 up to 720x480 is suggested.
Frame rate: always try to maintain the same (or nearly the same) frame rate as the original clip. To see how many frames per second the original clip is, right-click on the video clip and choose Properties > Details > Frame Rate.
Aspect ratio: either Auto or 16:9.
Audio quality: 128 kbps
Audio codec: MPEG-4 AAC
Channels: 2 channels, stereo
Sample rate: varies / 44100 Hz
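For anyone converting from the command line instead of a GUI tool, a roughly equivalent ffmpeg invocation might look like this sketch (file names are placeholders; ffmpeg keeps the source frame rate by default):

```python
# Rough ffmpeg equivalent of the settings above, wrapped in Python.
# File names are placeholders; ffmpeg keeps the source frame rate by default.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "input.avi",        # source clip (placeholder)
    "-c:v", "mpeg4",          # MPEG-4 Part 2 video codec
    "-vf", "scale=640:480",   # within the suggested 640x480 to 720x480 range
    "-c:a", "aac",            # MPEG-4 AAC audio
    "-b:a", "128k",           # 128 kbps audio
    "-ac", "2",               # 2-channel stereo
    "-ar", "44100",           # 44.1 kHz sample rate
    "output.mp4",
], check=True)
```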
NOTE: I can't be sure if it's just my phone in particular; I've only had it for a week, and I don't want to think it's my phone that doesn't play video perfectly. However, the settings above let me play full screen with no frame skips and high quality; almost everything is perfect, with one exception.
The frames don't skip, but it seems to lag a LITTLE BIT... for example, you won't see black strips or shifting from one scene to another, but rather... crap, I can't explain it, lolz
So, as far as I can tell, on an HD dock, the device outputs a native 1080p signal.
But the highest *resolvable* resolution is the native 540p of the device.
Is there any way to hack the framebuffer to 1080p (possibly using a tablet version of ADW or another such launcher?), strictly during HDMI mirroring?
End goal: running 1080p Ubuntu while docked, plus HD Netflix and HD photo viewing.
My TV reports a 1920x1080 input resolution, but for anything mirrored it's just 960x540 being upconverted by the phone. Apps like the picture viewer, camera video player, and YouTube appear to be higher-res, since they can display the media only on the HDMI-connected screen while the phone shows black space.
If someone can figure out a way to get the phone to disable mirroring and instead use only the HDMI-connected display, then you may be able to get it to output native 1080p without upconversion.
I don't know if this answers your question, but I find turning my phone sideways puts it in full screen. YouTube sucks though; it looks all pixely.
Whether it's AllCast, LocalCast, Castaway, etc., none of these apps can do 1080p without stuttering. Is this a hardware limitation of the Chromecast, or of my phone?
The Chromecast is capable. As I write this, I am streaming Ender's Game in 1080p from Plex to my Chromecast. I have never had good results with AllCast, however, and I'm guessing the case would be similar with other device-local content casting apps. My theory is that most Android devices aren't capable of the throughput necessary to support 1080p streaming locally. When uploading a test video from my Note 2 to my Plex server for testing, the best transfer rate I could get was just under 1 MByte/s, not really enough for 1080p streaming. Once uploaded, playing via Plex worked just fine.
siratfus said:
Whether it's AllCast, LocalCast, Castaway, etc., none of these apps can do 1080p without stuttering. Is this a hardware limitation of the Chromecast, or of my phone?
The Chromecast itself is fully capable of 1080p playback. The issues lie in wireless bandwidth and video format (compression and container).
See WiFi Bandwidth and Router considerations and Supported Media for Google Cast
siratfus said:
Whether it's AllCast, LocalCast, Castaway, etc., none of these apps can do 1080p without stuttering. Is this a hardware limitation of the Chromecast, or of my phone?
My way to figure out whether a video will stream on my network is (this isn't a perfect science, mind you, and this is as far as I know):
-Check the wireless connection rate at my Chromecast (72 Mbit/s with mine)
-Divide by 8 (that gives you MByte/s)
-Check that the video source's bitrate falls within that value
This would give me (in a perfect world) the ability to stream a 9 MByte/s video source. Don't forget to divide by 2 if the source of your content is wireless as well. In my case, my netbook is connected directly to the router, so it's a non-issue.
Someone please correct me if I'm wrong
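Here's that rule of thumb as a tiny Python sketch (best-case arithmetic only; real WiFi throughput falls well short of the reported link rate):

```python
# sherdog16's rule of thumb as a sketch - perfect-world numbers only.
def streaming_ceiling_mbytes(link_mbits: float, wireless_source: bool = False) -> float:
    """Best-case streamable video rate in MByte/s for a given link rate."""
    if wireless_source:
        link_mbits /= 2  # the same radio must carry the stream twice
    return link_mbits / 8  # megabits -> megabytes

print(streaming_ceiling_mbytes(72))        # 9.0 MByte/s, wired source
print(streaming_ceiling_mbytes(72, True))  # 4.5 MByte/s, wireless source
```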
sherdog16 said:
My way to figure out whether a video will stream on my network is (this isn't a perfect science, mind you, and this is as far as I know):
-Check the wireless connection rate at my Chromecast (72 Mbit/s with mine)
-Divide by 8 (that gives you MByte/s)
-Check that the video source's bitrate falls within that value
This would give me (in a perfect world) the ability to stream a 9 MByte/s video source. Don't forget to divide by 2 if the source of your content is wireless as well. In my case, my netbook is connected directly to the router, so it's a non-issue.
Sounds about right.
The only other limiting factors would be:
-your router's ability to sustain the wireless rate (lower-end routers sometimes only peak at advertised rates)
-your device's ability to sustain the required rate
-your device storage's ability to sustain the required read rate
Any video whose bitrate is higher than 5000 Kbit/s (that is, most 1080p videos) is likely to stutter because the available WiFi bandwidth is insufficient or too irregular. WiFi sucks compared to Ethernet.
For some reason I've noticed stuttering only on videos taken with my phone. I cautiously played 720p videos for a time because I thought the issue was that the vid was full HD. Turns out it wasn't, as Thor 2 in 1080p played flawlessly for me. It helps to have the fastest speed your provider offers... in my case I have 30 Mb down, which is reduced to around 20 Mb through my wall. I hope Google Fiber makes its way to my town eventually.
I use Serviio and Avia. I don't have any stuttering on 1080p. I did have issues with AllCast and 1080p, but I tend to use AllCast with other software and don't have any issues as long as the video is below 1080p.
For what it's worth, "1080" isn't always the same "1080". It really comes down to the bitrate of a video. Native 1080p (ripped from a Blu-ray) is something like 30 Mb/s. My S4 records at something like 15-20 Mb/s. If you download a YIFY torrent that is "1080p", it'll tend to be around 4 Mb/s. As you can see, there's a big difference (these are approximate numbers off the top of my head, but you get the point). If you're having a problem with a video, I'd suggest a run through Handbrake and it'll play fine. My suggested settings are as follows:
-High profile
-Web optimized checked
-Set Denoise under filter settings (more takes longer; I default to Weak)
-Choose your poison for encode speed and RF quality. (Overnight I do Very Slow and usually an RF of 19 if I'm looking for HD quality and a decent file size.)
-Under the audio tab, make sure you're giving the result AAC audio to work with. (I tend to bump the rate up to 256.)
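Those GUI settings map roughly onto a HandBrakeCLI call like the sketch below. Flag spellings have changed across HandBrake versions (the denoise and AAC encoder options in particular), so treat this as a hedged example and check HandBrakeCLI --help on your build:

```python
# Approximate HandBrakeCLI equivalent of the GUI settings above, wrapped
# in Python. Flag spellings vary by HandBrake version - verify locally.
import subprocess

subprocess.run([
    "HandBrakeCLI",
    "-i", "input.mkv",               # placeholder source file
    "-o", "output.mp4",
    "-e", "x264",                    # H.264 video
    "--encoder-profile", "high",     # "High profile"
    "--encoder-preset", "veryslow",  # encode speed: choose your poison
    "-q", "19",                      # RF 19 for HD quality / decent size
    "-O",                            # web optimized
    "--denoise=weak",                # weak denoise filter
    "-E", "av_aac",                  # AAC audio ("faac" on older builds)
    "-B", "256",                     # audio bitrate bumped to 256
], check=True)
```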
My guess is that this is the reason Google doesn't want to officially support local content. There are a lot of hurdles to jump, and all content is not created equal. Someone streams an "HD" video on Netflix and then thinks they should be able to stream ANY "HD" content. Not the case, as we're finding out.
Speaking of yify (happy retirement), I get no stuttering from any of their MP4s, but I do if I use BubbleUPnP. I don't think it's just limited to video bitrate, though that clearly does have an impact. I think the software used should also be considered as a possible choking point. As I mentioned earlier, Serviio has consistently given me the best results.
Great point. "1080" is just part of the story, "1080p" is a little more, but still not the full story. It's like calling a seamstress and asking them to make you a shirt, but only telling them "I'm male" or "I'm a tall male" - not unhelpful, but still not enough data.
A video file consists of:
Resolution
This is the stored or "captured" resolution, not necessarily the displayed size
Pixel aspect ratio (PAR)
The "shape" each data pixel should be displayed at. The combination of resolution and pixel aspect ratio determines the display aspect ratio (DAR), which is sometimes defined in place of PAR. Unfortunately pixel aspect ratio is not defined the same way in all formats. For MPEG and most containerless formats, it's defined by the CODEC itself. The AVI container does not have a place to store it, so AVIs will play assuming square pixels except Windows Media Player makes some assumptions about certain video frame sizes and tries to compensate (sometimes incorrectly).
Luckily, the HD and UHD resolutions use square pixels so there's less to worry about.
Field Order
Whether samples are full frames (progressive) or fields (interlaced), in upper/top-field-first (UFF/TFF) or lower/bottom-field-first (LFF/BFF) order. Sometimes you'll see field order referenced as "odd" or "even" field first, but this is ambiguous, as some things label the upper field as field 0 (which would be even) while others label it as field 1 (which would be odd).
Sampling rate
How many samples per second (e.g., 50 Hz, 60 Hz)
Higher sampling rate = smoother motion. This is why 24 Hz content that isn't shot specifically for film rate (avoiding fast motion and fast pans/zooms) looks "jumpy" compared to "regular" video.
Bitrate
What the data rate is - usually stated in bits per second (bps), kilobits per second (Kbps) or megabits per second (Mbps)
This is what determines the size of the video portion, regardless of resolution, interlacing and sampling rate.
Bitrate and video quality go hand-in-hand. The more bits you have, the better each video frame will look.
Compression type (CODEC)
What format the video is compressed in, for example, MPEG-1, MPEG-2, MPEG-4, WMV, VP6, DivX, Lagarith, etc.
CODEC and bitrate go hand-in-hand as well. More-complex compression algorithms can provide better quality with a lower number of bits.
Container format
How the video is "wrapped" or packaged. Some formats like MPEG and Windows Media support multiplexing and can be self-contained, so they can exist outside of a container. Other formats usually exist in a QuickTime container (.mov file) or DirectShow/Video for Windows container (.avi file)
Elements from containers can be added and removed without impact to audio/video quality.
Audio compression type
Like video compression, this is the format the audio is compressed in, if any. Common formats include MPEG-1 Layer 3 (aka "MP3"), AAC, Dolby Digital, etc. Audio can also be uncompressed LPCM, often referred to as just PCM.
Audio sampling rate
The number of audio samples per channel, per second - usually in kilohertz (kHz)
Audio sample depth aka bit depth
How large each audio sample is, usually stated in bits (8-bit, 16-bit, etc)
Audio bitrate
What the data rate is - usually stated in bits per second (bps), kilobits per second (Kbps) or megabits per second (Mbps)
This is what determines the size of the audio portion, regardless of channels, sampling rate and sample depth.
Bitrate and audio quality go hand-in-hand. The more bits you have, the closer the audio will sound to the source.
The overall size of the video portion is video bitrate x length of video in seconds
The overall size of the audio portion is audio bitrate x length of video in seconds
Add any additional metadata overhead and additional tracks (subtitles, etc) from the container (if applicable), and you have the total file size.
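As a worked example of those two formulas (numbers picked purely for illustration):

```python
# Worked example of the size formulas above (illustrative numbers).
video_kbps = 5000        # video bitrate, kilobits per second
audio_kbps = 256         # audio bitrate, kilobits per second
seconds = 2 * 60 * 60    # a two-hour video

video_bytes = video_kbps * 1000 // 8 * seconds  # bits -> bytes
audio_bytes = audio_kbps * 1000 // 8 * seconds

print(f"video: {video_bytes / 1e9:.2f} GB")  # 4.50 GB
print(f"audio: {audio_bytes / 1e9:.2f} GB")  # 0.23 GB
# Total file size = video + audio + container/metadata overhead
print(f"total: {(video_bytes + audio_bytes) / 1e9:.2f} GB (plus overhead)")
```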
So "1080p" only says it's a 1920x1080 resolution, and progressive samples. It does not say what the bitrate is or display/sampling rate is.
This will be slightly off topic but worth noting...
sherdog16 said:
For what it's worth, "1080" isn't always the same "1080". It really comes down to the bitrate of a video. Native 1080p (ripped from a Blu-ray) is something like 30 Mb/s.
Your post is pretty spot on; I just wanted to note that full native 1080p is actually a little more than 1 Gb of bitrate over HD-SDI.
Almost no one but the production crew ever gets to see the full resolution, not even the networks that will broadcast it, unless you mail the tapes to them.
Everything else is compressed to hell, including Blu-ray, and 40 Mb is about as high as you will ever see outside of the master tapes. And since most networks have decided NOT to support the HDCAM format in favor of XDCAM or digital storage (which are not much higher than Blu-ray quality, and compressed), it's rare to ever see a full-resolution 1080 signal in real life.
All these phones and such that claim to record in 1080p really only save in 1080p. Their CMOS doesn't have the resolution to properly capture native 1080p; at most it's 720 or 480 upconverted to a 1080-resolution file.
As for CCast and WiFi, I would never go over 10 Mb on a source without transcoding. 4-8 Mb is the sweet spot for WiFi transmission (IMO).
Unless you're used to seeing a full-resolution 1080 signal, you're really not going to miss or gain much by going higher than that for your library. You wouldn't see a significant difference until you got up to 40 Mb, which is a little higher than what your original source was. Going higher than the source does not bring back the original's resolution, so there is no point to it.
Most of my library is encoded at 4-6 Mb/s in 1080p, and I hardly ever have a problem streaming it to any device.
I think that you have a typo, Asphyx.
Plenty of phones have CMOS sensors exceeding 2 MP (that's about all a single 1080p frame is), so it's not resolution holding that back, it's a chain of poor response times.
EarlyMon said:
I think that you have a typo, Asphyx.
Plenty of phones have CMOS sensors exceeding 2 MP (that's about all a single 1080p frame is), so it's not resolution holding that back, it's a chain of poor response times.
Well, yes and no... The CMOS may have 2 MP (and some have more than that), but two things are in play there...
1 - Some of those pixels are split between G, R, and B; on an RGBG Bayer sensor, roughly half the photosites sample green and a quarter each sample red and blue, so a 2 MP camera is probably not really getting full HD. 6 MP would be the minimum for full 1080p - the old 4:2:2 standard.
But more importantly:
2 - Most video capture is not using the entire CMOS to capture the image, due to the 16:9 ratio of HD capture. And that's not so much about the CMOS as it is the lensing system.
In broadcast we use three 3/4" CCDs or CMOS chips, one for each color, with a prism to split and send the color to each chip. Each chip is full resolution, so we get 4:4:4 and every color is captured at full resolution.
Because of the lensing and focal length, the image projected onto these chips is very large compared to what lands on a phone CMOS, so the image is a lot clearer: less fuzz and better pixel resolution. In broadcast we shoot higher than HD, since we have an overscan-sized signal, and we cut out the HD bit we need when recording.
So yes, phones have the pixels needed, but in most cases they are not in the right place for full HD resolution. And due to the short focal length, they rarely ever use the entire chip.
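A quick sanity check on the megapixel arithmetic in this exchange (illustrative numbers only, assuming a plain RGBG Bayer layout):

```python
# Illustrative arithmetic only: how many photosites a Bayer sensor gives
# each color versus the ~2.07 MP in a single 1080p frame.
frame_mp = 1920 * 1080 / 1e6
print(f"one 1080p frame = {frame_mp:.2f} MP")  # 2.07 MP

sensor_mp = 6.0  # hypothetical sensor size from the discussion above
green = sensor_mp * 0.50       # RGBG Bayer: half the sites sample green
red = blue = sensor_mp * 0.25  # a quarter each for red and blue
print(f"green {green:.1f} MP, red {red:.1f} MP, blue {blue:.1f} MP")
```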
Thanks, I'm very familiar with the RGBG Bayer filter; for those that aren't: http://en.wikipedia.org/wiki/Bayer_filter
As for the 2 MP thing - I didn't mean to imply that a 2 MP sensor would take 1080p vids, and no one making a phone claiming 1080p uses such a low MP-count sensor.
The smallest I know of is the HTC One at 4 MP, and that's 16:9 all of the time; most everything else is 5, 8, 12 or more MP.
So, on that basis, allowing for the Bayer filter, lower quality without oversampling, and 16:9 masking, I'll maintain that the problem on the top end phones claiming 1080p video isn't resolution - it's response time.
I'm familiar with 3-chip cameras - I used to own a Canon XL-1 (SD, obviously) - and I'm way too familiar with CMOS and CCDs at the silicon level.
The CMOS mobile sensors are noisy, not terribly sensitive, and s.l.o.w. They're price-effective, but they're just not CCDs.
You can dial in a higher bitrate on many Androids, especially with root options - that's probably the darling camera-app mod - but you won't get faster than the sensor response time + readoff time + binning time + processor time of attention (usually an image processor in the main SoC, but sometimes a CPU core) + the frame-rate processing algorithm time + compression time + whatever else I forgot.
And that's why phone videos stutter: when the system can't keep up, it simply lowers the fps.
The new crop is promising higher frame rates. We'll see.
As for frame quality - that's affected by all of the things you mentioned (and let's toss in inaccurate color rendering and plastic lenses for those without an iPhone, while we're at it).
1080p can be done - sufficient phone sensors exist in terms of MP, and you can wind up the Mbps - but you can't cure light sensitivity and noise, and what most people shoot slows down an already slow subsystem.
Edit - posting this made me think, so I went and checked my video closet - I actually still have a 3CCD Canon GL1 that I completely forgot about. ROTFLMAO - I'll have to dust it off and see what I get.
I agree with you that the speed is a problem as well...
But when push comes to shove, at some point phones (and CMOS) will catch up, and we won't have to wonder whether a particular model is true HD or not.
As recently as a year or two ago, HD recording was more of a marketing pitch than a reality.
Phones (and their cameras) have improved a lot since then, and we even have a few camera-first phones being made where the camera and lensing are prioritized to get a better picture.
It's something I expect we'll tell our kids about the good old days, when HD cameras in phones weren't really HD! LOL
They won't believe us!
EarlyMon said:
I think that you have a typo, Asphyx.
Plenty of phones have CMOS sensors exceeding 2 MP (that's about all a single 1080p frame is), so it's not resolution holding that back, it's a chain of poor response times.
EarlyMon, you have too much knowledge for one human being.
I feel embarrassed.
Since I got the CC mirroring feature on my Nexus 7 2013 LTE running stock 4.4.3, the quality seems very good.
Compared to another Miracast device, I suspect the CC mirroring is *better* in resolution.
Does anyone know the actual CC resolution specs of the signal to be displayed?
Or is there some test I can do with my monitor to figure it out for myself?
I've actually been wondering this myself...
Chromecast itself outputs 480p, 720p and 1080p (not sure about 1080i) - that's what the TV receives, because it's an HDMI video device, as opposed to a computer HDMI output.
So whatever resolution file or stream is being displayed needs to get up- or down-scaled to one of the supported resolutions.
The core question: what resolution is the image, before scaling, that is sent to Chromecast?
To test this, we need a video file that will show scaling. A frame full of alternating black and white horizontal lines is usually enough. So, to test if it's putting out 1080p, you make a 1920x1080 file with alternating black and white horizontal lines (540 white, 540 black), then display it on Chromecast to a 1080p TV.
Replace 1080p with 720p if you only have a 720p TV. If you have a TV that is not native 480p, 720p, or 1080p, you can't really do this test unless the TV supports a 1:1 mode that bypasses scaling.
If the image displays as a solid single shade of gray (or, if you have sharp eyes, alternating single-pixel black and white lines), then there's no scaling and Chromecast is receiving a 1080p signal.
If you see "bands" of gray or fat lines, there is scaling going on, and Chromecast is not receiving a 1080p signal.
I'll try to do a test...
Managed to do a quick test... Basic screen mirroring seems to happen at screen resolution (makes sense).
One thing I'd like to test but couldn't is playback of a video file using the native Video Player app (which displays a screen cast icon during casted playback). I suspect it just sends the video data directly to Chromecast in this (special) case.
Most of the models that support mirroring already use at least 720p resolution for their screens, so it may not need to do any upconverting at all.
Asphyx said:
Most of the models that support mirroring already use at least 720p resolution for their screens, so it may not need to do any upconverting at all.
It all depends... I don't think (or at least haven't noticed that) Chromecast changes resolution once it "decides", so even if the device is sending 720p, if Chromecast is at 1080p, something somewhere needs to upscale.
Okay, did a follow-up test with a Handbrake-converted version (the native Video Player can't handle TS) of the "Alternating black/white 1 pixel full field 1920x1080" clip from www.w6rz.net.
I also used "Overscan lines at 0, 2.5 and 5.0 percent with 0 to 16 pixel cropping bars 1920x1080" to verify what was happening in different modes, since the native Video Player doesn't explain itself (damn icon-based reality - this is why I don't like toolbars!).
On the TV...
Avia:
Shows proper single-pixel lines on Chromecast
This serves as the control image for full 1080p as Avia just sends the native video over without scaling.
MX Player:
100% shows proper single-pixel lines on Chromecast; on my phone, single-pixel lines using the SW decoder but banded gray using the HW and HW+ decoders (which is odd - points to an anomaly/defect in the decoder, perhaps)
Fit to Screen shows even, fat lines
Stretch shows banded fat lines
Crop shows banded fat lines
Video Player:
Two corners (Fit?) shows even, fat lines (same image as Fit to Screen in MX Player)
Letterbox shows even, fat lines (same image as Fit to Screen in MX Player)
Four corners (Stretch?) and Full (Crop?) show banded fat lines
If the native Video Player did, as I had hoped, output the full native resolution to Chromecast, I should have seen proper single-pixel lines, looking gray from a distance but alternating black/white up close.
Sadly, Video Player does not seem to do this; it sends the device's native resolution, or some derivative. This makes me wonder why Video Player blanks out the screen during screen casting - casting other content works fine with video going to both the screen and Chromecast. It might be to ensure the best framerate or something, but it seems odd to me.
I don't know what magic they implemented, but my HTC One can be mirrored without any noticeable loss of picture quality. Compare that to casting the screen of a powerful i7 PC, which lags heavily every time I use it (a little better with Canary).
Either some frames are dropped to keep up, or the bitrate is lower, but whatever they did, I like it.
jasenko said:
I don't know what magic they implemented, but my HTC One can be mirrored without any noticeable loss of picture quality. Compare that to casting the screen of a powerful i7 PC, which lags heavily every time I use it (a little better with Canary).
Either some frames are dropped to keep up, or the bitrate is lower, but whatever they did, I like it.
Especially head-to-head against Miracast (Samsung AllShare Cast Wireless Hub), I prefer the Chromecast implementation - no noticeable picture break-up, and the framerate feels more consistent.
Some of the articles say Google wrote their own framework for it. Given that it also seems to work quite nicely on my Galaxy S3, which has a lot less horsepower than the Galaxy S4, I'm very impressed. Even for streaming Internet video (Xfinity TV Go), it's working very well. My only problem is that my phone overheats on long streaming sessions (over an hour) and the battery stops charging, heh. Not bad considering the S3 isn't officially supported to begin with.
@bhiga
I would think the scaling happens in the loaded app and not the CCast itself, which would explain what you are seeing...
The TV doesn't really change its res based on what the CCast is being sent; the loaded app does the upscaling, and that's why it looks as bad as it does.
Kind of like taking a 640x320 video and displaying it full screen on an HD TV!
So it's not a hardware thing, just a software thing, and it doubles pixels as needed.
Asphyx said:
@bhiga
I would think the scaling happens in the loaded app and not the CCast itself, which would explain what you are seeing...
The TV doesn't really change its res based on what the CCast is being sent; the loaded app does the upscaling, and that's why it looks as bad as it does.
Kind of like taking a 640x320 video and displaying it full screen on an HD TV!
So it's not a hardware thing, just a software thing, and it doubles pixels as needed.
Yes, the Chromecast->TV resolution stays fixed AFAIK.
I'm not sure whether Chromecast reports its negotiated resolution back to the framework or app using it, which is why I think the scaling just happens in the Chromecast hardware - after all, you can send Chromecast "raw" video of arbitrary size in a supported compression and it'll still scale up/down as necessary. Doesn't really matter where/how it happens, though.
The question here was whether Chromecast screen casting takes the image at the device's screen resolution, or somehow creates a "virtual" screen at whatever resolution Chromecast is sending to the TV and renders directly to that as well.
I didn't expect it to do so for non-video stuff, and it doesn't. It's essentially just sending the screen buffer to Chromecast as well as to the device's screen.
The native Samsung Video Player*, however, does not show the video on the device while screen casting (so it's not sending the screen's buffer) - it only shows it on Chromecast, which made me hope that perhaps it was sending the full-resolution video rather than the version scaled for the device's screen. Unfortunately, that does not seem to be the case.
So the core story seems to be:
Screen cast quality is dependent on the device's screen resolution.
Higher screen resolution = higher quality screen casted image, up to 1080p.
Lower screen resolution = lower quality screen casted image. If your device's screen resolution is less than the resolution Chromecast is outputting to your TV (1080p or 720p), your screen casted Chromecast output will be degraded.
The native Samsung Video Player does not show the video on the device screen while screen casting, but that does not seem to change the image resolution. In other words, on a device with a 720p screen, playing a 1080p video while screen casting yields 1080p original -> 720p scaled down for the screen -> 1080p scaled up for the TV, rather than 1080p original -> 1080p for the TV.
* Note that this is based on the AT&T Samsung Galaxy S3, so the "native Video Player" is the Samsung Video app from TouchWiz. Behavior may differ in other ROMs that use a different video player. I just found it curious that the native Samsung Video did not show the video image during playback. Other video apps like MX Player screen cast fine while showing the video on both Chromecast and the device's screen.
VLC shows video controls while casting the video, which is very convenient.
Great discussion and research. Thanks.
----------
I ran a test with Adobe Reader and FBReader, viewing a .pdf file via CC mirroring on my HDMI monitor. I reduced the font size until the text was unreadable on the Nexus 7 (2013) LTE, but it was still very clear on the HDMI monitor. Generally, though, I still did not find a solution for reading side-by-side PDF documents on my HDMI monitor. LOL
During this test I noticed the text display was done with a *2-pass* screen draw on the HDMI monitor. Makes me think Google's CC mirroring implements some sort of two-pass resolution; e.g., the pseudo-algorithm might go:
Pass 1) Draw at reduced resolution; if it's a static image, go to pass 2, else (dynamic image/movie) loop to the next frame.
Pass 2) Draw fill to enhance the resolution of the static image.
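In code, that guess might look something like this toy loop (pure speculation about the mirroring internals; every name here is hypothetical):

```python
# A toy model of the speculated two-pass draw - guesswork about the
# mirroring behavior, not Google's actual code.
def render_loop(frames, send):
    """frames yields a new screen image, or None while the screen is static."""
    prev = None
    for frame in frames:
        if frame is not None:
            send(frame, "low")   # pass 1: cheap reduced-res draw keeps the framerate up
            prev = frame
        elif prev is not None:
            send(prev, "high")   # pass 2: screen went static, redraw with full detail
            prev = None          # refine once, then wait for the next change

# Simulated: two moving frames, then the screen holds still.
render_loop(["frame1", "frame2", None, None], lambda img, q: print(q, img))
```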
----------
Another test might be comparing the same image displayed two ways:
(1) first, screen cast via CC, compared to
(2) second, using an application that's capable of *true* CC
(where *true* means CC fetches its own image and displays it at full resolution; I'm also assuming full resolution is 1080p?)
My phone's resolution is 540x960 and my TV's resolution is 1080x1920. The picture looks extremely bad on the TV. Is there a way to get better quality on the TV? Also, the response time is almost half a second. Can I reduce it somehow? For video it's not much of a problem, but for browsing and games it's frustrating.
Thanks in advance
Edit: I use ClockworkMod's "Screen Recording and Mirror" and "AllCast receiver"
newuser156 said:
My phone's resolution is 540x960 and my TV's resolution is 1080x1920. The picture looks extremely bad on the TV. Is there a way to get better quality on the TV? Also, the response time is almost half a second. Can I reduce it somehow? For video it's not much of a problem, but for browsing and games it's frustrating.
Thanks in advance
Edit: I use ClockworkMod's "Screen Recording and Mirror" and "AllCast receiver"
There's an app called Second Screen on the Play Store. It lets you boot your device using another screen resolution. Root required. :good:
kawutzel said:
There's an app called Second Screen on the Play Store. It lets you boot your device using another screen resolution. Root required. :good:
I tried it, and it looks even worse. I noticed that for some reason things look better if I zoom.