Is the Note 9 still stuck on Samsung's version of Quick Charge 2.0? I wonder why they haven't moved to faster charging, especially since the battery is bigger.
Based on what I've found so far, it's because they want to play it safe so the batteries, you know, don't explode again. That said, I also think they are being too conservative about this.
It charged plenty fast enough for me: from 33% to 100% in a bit over an hour.
It's really sad when you pay $1,000 for a phone and get years-old Quick Charge 2.0 technology.
That's like buying a $70,000 sports car that goes 0-60 in 8 seconds. WTF.
quick charge 4 - Charge up to 4X faster than conventional charging. 5 minutes of charging gets you 5 hours of talk time.
https://www.qualcomm.com/solutions/mobile-computing/features/quick-charge
netnerd said:
It's really sad when you pay $1,000 for a phone and get years-old Quick Charge 2.0 technology.
That's like buying a $70,000 sports car that goes 0-60 in 8 seconds. WTF.
quick charge 4 - Charge up to 4X faster than conventional charging. 5 minutes of charging gets you 5 hours of talk time.
https://www.qualcomm.com/solutions/mobile-computing/features/quick-charge
This is exactly what I'm talking about. They even have Quick Charge 4+ now. I've never understood why, even before the Note 7 debacle, we were still dealing with older charging technology. Even with 4+, no phone has it yet, and I wonder why not. Even if Samsung moved to something faster like Quick Charge 3.0, it would be better.
"QC 4+ has three major improvements. The first is 'Dual charge,' which divides the charge current across two power management ICs. Qualcomm says this should result in reduced charge time (approximately 15%) and lower thermal dissipation.
The second new feature is 'Intelligent Thermal Balancing,' which moves the electric current through the coolest path automatically, eliminating potential hot spots. Finally, QC 4+ includes a handful of new safety features, such as monitoring both the device case and connector temperature levels at the same time."
That's Samsung. It's like the fingerprint sensor: first they put it in a stupid position, then they move it back to the middle and call it an improvement. Next gen you will get QC 3.0 or maybe QC 4.0 as one of the selling points.
Anyway, a charging time of about 1h 50m is not too bad (but not what you would expect from a $1k flagship).
They need to leave some good things for the S10 to make it feel more revolutionary
xMadMike said:
They need to leave some good things for the S10 to make it feel more revolutionary
Well, I think not..... maybe for the S20
Why abuse the battery? These should last something like 5,000 charge cycles if kept between 25% and 85%, and on slow cable charging it only takes 90 minutes for a full day and night of use!
I keep my wireless charger handy but only needed it 3 times in 3 months
And mine pretty much runs a small business
Photos by Sully, using SM-N960U or SM-870A
While on the subject, what is the fastest way to charge the Note 9 based on your experiments? For me, I have found that a 45w USB-PD Car Charger and the J5 Create USB-C cable is the fastest I have found so far. I'd be curious to know what everyone else prefers.
I think this may be due to Samsung's own Adaptive Fast Charging system, which isn't the best.
robroy90 said:
While on the subject, what is the fastest way to charge the Note 9 based on your experiments? For me, I have found that a 45w USB-PD Car Charger and the J5 Create USB-C cable is the fastest I have found so far. I'd be curious to know what everyone else prefers.
What is the maximum mA you are getting with that setup? I usually get 2000-2200 mA on a Motorola 30W TurboCharger.
Why would they waste time pursuing Qualcomm Quick Charge? Qualcomm QC is their proprietary Snapdragon standard, just like CDMA. Google has pushed for a long time for a standard power interface. USB-PD takes QC-style technology and moves it to the USB level with a standardized OS interface, and if you read how it works, it works much like QC except the voltage requests are 10x more granular than QC 3.0+. Samsung themselves ship both Snapdragon and Exynos processors, so they don't want to maintain a different driver and different hardware for every phone either. USB-PD has been on the Note since at least the Note 8, and maybe before. It's on iPhones, etc. Pretty much anything with USB-C, except maybe the very first devices, is going to use USB-PD. USB-PD can do 100 watts (20 V * 5 A) and power laptops and tablets, and the phone can limit the power to what it wants. 27 W capable USB-C car chargers are common. 65 W wall chargers are common. It has been rolled out on power banks, and you'd be silly to get one without it.
Edit: USB-C and USB-PD are bi-directional. Because it is implemented at the USB level, you can fast charge your power bank with a USB-PD wall charger and then fast charge your phone from the same USB-C port. (Of course, you can't do pass-through while you are using that USB-C port to charge the power bank.)
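To make the negotiation idea concrete, here is a minimal sketch (in Python) of how a USB-PD sink might pick from the fixed-voltage profiles a charger advertises; the profile list, the 9 V / 18 W sink limits, and the function names are illustrative assumptions, not any particular phone's or charger's values.

Code:
# Rough sketch of USB-PD source-profile selection. The advertised profiles
# (PDOs) and the sink limits below are made-up examples for illustration.
ADVERTISED_PDOS = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 5.0)]  # (volts, max amps)

def pick_pdo(pdos, sink_max_volts=9.0, sink_max_watts=18.0):
    """Return (volts, amps, usable_watts) for the most useful profile the sink can accept."""
    best = None
    for volts, amps in pdos:
        if volts > sink_max_volts:          # the sink cannot accept this rail
            continue
        usable = min(volts * amps, sink_max_watts)
        if best is None or usable > best[2]:
            best = (volts, amps, usable)
    return best

volts, amps, watts = pick_pdo(ADVERTISED_PDOS)
print(f"Request {volts:g} V at up to {amps:g} A, drawing about {watts:g} W")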
IT_Architect said:
Why would they waste time pursuing Qualcomm Quick Charge? Qualcomm QC is their proprietary Snapdragon standard, just like CDMA. [...]
So where are you getting that CDMA is proprietary Qualcomm tech again? ...lol
Well, of course CDMA (Code Division Multiple Access) as a name describes a technology that predates cell phones. When it comes to cellular, Qualcomm holds the key patents to its practical deployment on cell phones. CDMA was far more widely implemented than people would have us believe. On the other hand, it is not difficult to understand why the European countries, through the European Telecommunications Standards Institute (ETSI), would not want to be beholden to Qualcomm or any other single company, and that became the charter and power behind GSM. GSM was legislated so it wouldn't have to compete on its merits. GSM architecture promotes interoperability, while Qualcomm ties the hardware to the carrier, which the carriers liked a lot. Example: You want to switch from Sprint to Verizon? No problem. All you need to do is pay off the phone on your current contract plus $200-$400, get into another contract with Verizon, and amortize another $1,000 phone, because the old phone can only register on the Sprint network. They still had "World Phones" with a SIM slot so you could travel with the phone, but they didn't work with domestic GSM bands, so it isn't as if you could use one with Cingular/AT&T or T-Mobile. Similarly, Clearwire developed WiMAX, and it was being rolled out before LTE was defined. Sprint and Verizon saw the potential to dominate that market. However, as with iDEN, Sprint, the once-dominant carrier, once again demonstrated its unusual capacity for transforming a silk purse into a pig's ear by buying Clearwire, which caused Verizon to bolt in support of LTE, after which Sprint realized they had no chance of pulling this off without Verizon, and WiMAX became a niche product while the world waited years for LTE to be defined, developed, and rolled out. (LTE itself moved to OFDMA rather than CDMA, but Qualcomm still holds key LTE patents.)
Qualcomm protects their huge investments and risks vigorously, as they should. Apple sued them for overcharging for chips and failing to pay $1 billion in rebates. Apple thought they had won. Afterward, Apple wanted to use Qualcomm modems in the iPhone XS and XR, but Qualcomm refused to sell them after Apple sued over its licensing practices, leading to Apple using Intel's modems instead. Then Intel decided they couldn't compete and wouldn't invest in the 5G game. Thus, just last month, Apple came back to Qualcomm, paid an undisclosed sum, and entered a six-year licensing agreement. The problem for other companies is that they cannot develop their own alternatives to what Qualcomm licenses. A while back the European Commission fined them for antitrust violations, as did Korea before that, and it has happened elsewhere as well. They are simply very good at what they do, and their technologies have synergies that sell each other. Qualcomm is a highly competent ecosystem that is not heavily legislated against in the US, and what they did to implement quick charging by negotiating a higher voltage outside of the USB 5 V standard was well done. Another example of an ecosystem making the sale is Windows. Windows was brought to market as the graphical subsystem of the Word and Excel install, and Word and Excel, with their WYSIWYG, wildly outclassed the de facto standard bearers of that day, WordPerfect and Lotus 1-2-3.
As with any industry, as things mature, proprietary technologies give way to open standards when they can, and charging is no different. Steve Ballmer, against the screaming of developers and users, killed Microsoft's mobile market when he shut down Windows Mobile, which had a huge catalog of very useful apps, including Office, and introduced a gaming operating system, Zune/Windows Phone, with no real apps, AFTER the iPhone came to market, and tried to do an Apple-style store where only Microsoft could make good apps. (Zune/Windows Phone/Windows RT/whatever is the OS that runs metro/modern/universal apps.) That in turn caused investments by cell phone companies in the only other hardware-vendor-independent phone OS with any potential, Android, and most of the Microsoft app developers went to Android. Together, they developed Android into what it is today. Google worked with, and often pressured, vendors to donate their code to the core OS. Thus, Android became the Windows of the mobile device world, with many hardware vendors, and that put Google in the position to help bring focus to standardization in this area.
USB-PD has been developed to fulfill the charging role and dovetails nicely with USB-C's extra power capacity. Everyone is happy to see the new reversible connector that can also carry more power, and the timing of the new connector gets tied in people's minds to the new, open-standard charging technology, even though technically that connection does not exist. As power requirements have dropped, this was perfect timing for USB-PD and USB-C to become the charging standard for everything from phones to laptops. Of course, USB-C standardizes many other things as well, such as high-speed data, cables for monitors, etc. While the Power Delivery spec was finalized in 2012, it wasn't until April 2016 that chips were available to support USB-PD in phones, by which time PD was in its second revision. The current, third revision, PD 3.0, improves power delivery but is primarily known for offering much more information about the device being charged and its power and battery. This includes reporting any malfunction or system change within the device, the temperature of the device, or whatever may be causing hold-ups or decreased charging speed. For example, if an over-current or over-voltage occurs while charging, your device will notify you with a data message about the mishap. There was an effort to rule that Qualcomm's technologies were not compatible with USB-C, but Qualcomm won that argument. Also, the latest Qualcomm charging technology, Quick Charge 4.0, is built on top of USB-PD. Another advantage of USB-C is that it can work both ways. For instance, power banks take a long time to charge. If one has a USB-C port, you can rapidly charge that great big battery, and then use the same charger as the power supply for your devices, all without trying to match cables and technologies.
These are some of the reasons it makes sense to me that Samsung wouldn't be thinking about QC, especially since they had already deployed USB-PD in their hardware by the time the Note 8 came out. Personally, the faster I get rid of my non-USB-C devices, the better.
IT_Architect said:
Well, of course CDMA (Code Division Multiple Access) as a name describes a technology that predates cell phones. When it comes to cellular, Qualcomm holds the key patents to its practical deployment on cell phones. [...]
thanks for defining the difference and of course the enlightenment!
bober10113 said:
thanks for defining the difference and of course the enlightenment!
You are welcome. It was too bad that we had to wait an extra 5 years and in some ways still not have what WiMAX had, but the silver lining is that 2G and 3G CDMA is about gone in the US; we have GSM-like phone portability domestically, worldwide, and across cellular network providers; and VoLTE gives clearer digital voice, with competition pushing cellular providers away from trying to force us to buy our phones from them to get VoLTE and carrier aggregation, as long as the device has the bands to do it. It's too bad we lost Windows Mobile, which was basically XP in your hand, but we did end up with an OS whose direction is dictated by the hardware vendors and software developers rather than one company's vision. It's also nice that with USB-C, one reversible cable can be used to charge power banks, charge from power banks through the same port, charge phones, and run or charge a laptop or tablet with up to 100 watts (20 V * 5 A); connect a printer or digital camera; and carry USB 3.2 at 20 Gbps, Thunderbolt 3, USB4 at 40 Gbps, DisplayPort 1.2 and 1.4, HDMI, and MHL and Super MHL as used in TVs, set-top boxes, DVD players, streaming sticks, and gaming consoles. And there are already adapter cables for most of the peripherals you already have that use the old connectors. Now that the future is here, we need to catch up to it, and once again the hardware vendors will be more than happy to help us. LOL!
cushcalc said:
What is the maximum mA you are getting with that setup? I usually get 2000-2200 mA on a Motorola 30W TurboCharger.
I get around 2200 mA from a 45W USB-PD charger.
It charges a little faster than the one that comes with the phone.
Sent from my ONEPLUS A5010 using Tapatalk
Related
Lapdock+Wii == Gametrix
So I have a spare Nintendo Wii and a Lapdock (hopefully I can pick up some more if there are still any at RadioShack), and I am going to disassemble the Wii, reconfigure it to fit on the back "panel" of the Lapdock, and get the needed cords to create a Gametrix.
My initial goal is to connect a Nintendo Wii to my Atrix Lapdock by... (with modifications)
1) a. Having the Lapdock's male Mini-HDMI plugged into a female Mini-HDMI to M/F full-sized HDMI converter, thus allowing a direct HDMI connection from there.
This takes care of half of the connections for both the Lapdock and the Wii.
2) a. Ideally I am looking to hook the Wii's power supply directly to the Lapdock's male micro-USB port, via a female micro-USB to M/F full-sized USB adapter. From there I'd need a USB-to-female-wall-outlet (3-prong?) adapter; this would entail finding (unlikely...) or building one.
This takes care of the power needs (if it works), leaving only the uncertain usability of the Lapdock's built-in keyboard and mouse / USB ports.
2) b. If 2a (above) doesn't work because not enough power travels through the USB to the Wii, then I'd be left with connecting an external battery to the Wii and just taking advantage of the screen.
Hopefully, if I have to resort to option b, I'd at least be able to use the keyboard and mouse?
I will update with a Diagram of my plan “a” and “b” tomorrow after school.
Here is the average power draw for the Wii (about 17 watts): http://www.blogcdn.com/www.joystiq.com/media/2007/02/next-gen_console_power_lg.jpg
And here for the GameCube (about 23 watts): http://www.tpcdb.com/product.php?id=1615
Lapdock voltage output. ??? I think I'd have to replace the battery because it's only supposed to charge a phone/run an OS...
Lapdock internal battery mAh: ???
Please guys I know it's a lot but any input is great input
sounds fun. good luck dude.
Use a Wii, it's better and supports an HDMI converter.
The GameCube already has a battery pack accessory and a screen accessory, so that's easy and has been done.
Sent from Moto Atrix 4g on Neutrino 2.91
I have decided to go with the Wii. I am trying to find info on how many amps the Wii draws and whether the Lapdock is capable of powering it...
I think I would try seeing how they both look on the screen before diving in much further, but it does sound like it could be a fun project.
Other thoughts:
How useful will the project be specifically to you? Due to the limited availability of the Lapdocks, only a limited number of people will be able to try this themselves.
do you plan on strictly playing games, or are you going a bit further by using Linux on the device? I imagine there is a way you could use the keyboard as some type of input device, although you may have to create some translator device from a programmable Microcontroller.
is the screen big enough for enjoying using the device or is it more of a challenge than it's worth. The screen may be plenty big enough for a handheld device, but for something like using a Wii controller where you are at a distance from the screen, is it big enough?
Budget. This is something we all overlook far too often. Something starts off small and simple, but before you know it, you end up spending way more than you had anticipated.
I have 2 Wii's and a Lapdock, so budget is covered.
I'm not going to use the motion bar, just gamecube games.
My main concern is if the Lapdock can power it.
jeffreygtab said:
I have 2 Wii's and a Lapdock, so budget is covered.
I'm not going to use the motion bar, just gamecube games.
My main concern is if the Lapdock can power it.
Power will probably be an issue through USB, as the USB specification limits max current to 500-900 mA for general ports and 5 A for dedicated charging ports. Beyond that, the port should shut down to prevent burning out the controller.
Edit: I was looking at your figures above, did some digging, and I think you have a couple hurdles.
To start with, for power consumption you want to look more at peak than at minimum, and give yourself a bit of cushion, mainly because running at max power all the time will tax components. Second, and this is a big one, the Wii power pack is apparently 12V 3.7A (44.4W). USB is only 5 volts, and at 900mA you're peaking at 4.5W. However, if you could somehow manage to trick it into charge mode, you might be able to squeeze out 25W; that's if it works like a standard USB port. If you can do that, you can step up the voltage with a charge pump, but I'm not quite sure how close to max that will put you due to efficiency losses. You may be able to go the other way and use the Wii to power the Lapdock, or you may have to power them independently.
All that said, I think it's still important to just try to see what it looks like on the screen before digging in too far.
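For anyone following along, here is a tiny back-of-the-envelope version of that math in Python; the 90% boost-converter efficiency and the two port current limits are assumptions for illustration, not measurements.

Code:
# Can a 5 V USB source drive the Wii's 12 V / 3.7 A brick-rated load through a
# boost converter? All figures below are illustrative assumptions.
WII_WATTS = 12.0 * 3.7        # 44.4 W worst case, per the power-brick label
BOOST_EFFICIENCY = 0.90       # assumed converter efficiency

def available_watts(amps, volts=5.0):
    return volts * amps

def needed_input_watts(load_watts, efficiency=BOOST_EFFICIENCY):
    return load_watts / efficiency

for label, amps in [("USB 3.0 port (0.9 A)", 0.9), ("BC 1.2 charging port (5 A)", 5.0)]:
    have = available_watts(amps)
    need = needed_input_watts(WII_WATTS)
    verdict = "enough" if have >= need else "not enough"
    print(f"{label}: {have:.1f} W available vs {need:.1f} W needed -> {verdict}")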
I plan on powering it through the micro-USB, which you said can supply up to 23W+, which is apparently plenty for the Wii. I'm going to order the necessary cords to attempt this this weekend, so next week I'll know how much, if anything, I have to modify to power it.....
I'll continue to research, and thanks for your help!
EDIT: Screen size isn't an issue, as I'll have it right in front of me like a laptop, playing Super Smash Bros. Melee and the like.
BIG Message to everyone who's reading this. THIS IS MY FIRST HARDWARE MOD (as if that wasn't obvious?)
Anyways I'm gonna pick up a soldering iron as well, because I'm not finding a way to charge the Wii (assuming the Lapdock is capable...) There are no real ways to convert the micro usb to the 3 prong standard outlet that the Wii uses... So I may need to make my own? Will this work? http://www.sybausa.com/productInfo.php?iid=1274 Although I can't find where to buy it.
jeffreygtab said:
BIG Message to everyone who's reading this. THIS IS MY FIRST HARDWARE MOD (as if that wasn't obvious?)
Anyways I'm gonna pick up a soldering iron as well, because I'm not finding a way to charge the Wii (assuming the Lapdock is capable...) There are no real ways to convert the micro usb to the 3 prong standard outlet that the Wii uses... So I may need to make my own? Will this work? http://www.sybausa.com/productInfo.php?iid=1274 Although I can't find where to buy it.
You mean you want to convert 5V DC to 120V AC and then downconvert to 12V? Better to just go from 5V to 12V, but the reality is that it's much easier to go down than up. What does the lapdock itself have for a power supply?
Edit: Looks like the Wii has some type of USB keyboard support. Not sure if you want to try getting that to work, but it might come in handy.
I know but isn't the Wii's power cord a standard 120v 3 prong wall charger? I'd have to convert the 3 prong format to a Micro USB.
The lapdocks power supply is the battery if that's what you were asking...
Thanks for helping on my first project btw.
Check out about halfway down the page on this link if you want to see what the Wii looks like on the lapdock screen:
http://www.robpol86.com/index.php/Atrix_Lapdock_Other_Uses
jeffreygtab said:
I know but isn't the Wii's power cord a standard 120v 3 prong wall charger? I'd have to convert the 3 prong format to a Micro USB.
The lapdocks power supply is the battery if that's what you were asking...
Thanks for helping on my first project btw.
I'm glad to share my limited knowledge. Anyway, really, I'm unsure whether you can pull 25W out of the USB port or not, but even if you could, you'd lose a good chunk of that in going from 5V to 120V AC, because at this point a charge pump is no longer an option; instead you would need a power inverter, and since most of the commercially available ones are designed to go from 12V DC to 120V AC, you would likely end up building one yourself. The charge pump (buck-boost converter) is much easier to build, but I'm not sure how much power you can get out of it.
Here's one I built from modifying a schematic I found online somewhere:
I hate to be the party pooper here, but I think no USB port will ever be able to deliver that much power. We're talking about several amps here. Neither the USB port nor the great majority of USB cords are built to withstand that. Most USB hardware is designed to carry at most 1A. And then, even if you manage to get sufficient power flowing and power the contraption up, I wouldn't expect too much autonomy out of it since it is, after all, battery-powered. I would expect a lot of heat from the batteries too.
ravilov said:
I hate to be the party pooper here, but I think no USB port will ever be able to deliver that much power. We're talking about several amps here. Neither the USB port nor the great majority of USB cords are built to withstand that. Most USB hardware is designed to carry at most 1A. And then, even if you manage to get sufficient power flowing and power the contraption up, I wouldn't expect too much autonomy out of it since it is, after all, battery-powered. I would expect a lot of heat from the batteries too.
Battery Charging Specification 1.2: released in December 2010.
Several changes and increased limits, including allowing 1.5 A on charging ports for unconfigured devices, allowing high-speed communication while drawing a current up to 1.5 A, and allowing a maximum current of 5 A.
But as I said, I don't know if you can get that much power out of this particular device. 20W @12V is 1.6A, but in order to power that from 5V, you would need at least 4 Amps, which puts it close to max, but not over it. The actual port connector is rated much higher than that.
Edit: I do have to agree on one point though, running on battery power will be pretty limiting, especially when you consider what the batteries were intended for in the first place.
So you're saying that there's basically no way to power the Wii with the Lapdock's setup? Ughhh I assumed this would be a major issue but decided I'd leave the verdict to those more knowledgeable than myself... So you're sure there's no way? Well anyways I hope at the very least to connect an external Battery (recommendations?) and hopefully get the Wii to recognize the trackpad and keyboard esp. for linux use... I'll keep researching and keep you guys posted.
edit:
ravilov said:
I hate to be the party pooper here, but I think no USB port will ever be able to deliver that much power. We're talking about several amps here. Neither the USB port nor the great majority of USB cords are built to withstand that. Most USB hardware is designed to carry at most 1A. And then, even if you manage to get sufficient power flowing and power the contraption up, I wouldn't expect too much autonomy out of it since it is, after all, battery-powered. I would expect a lot of heat from the batteries too.
lehjr said:
Battery Charging Specification 1.2: released in December 2010.
Several changes and increased limits, including allowing 1.5 A on charging ports for unconfigured devices, allowing high-speed communication while drawing a current up to 1.5 A, and allowing a maximum current of 5 A.
But as I said, I don't know if you can get that much power out of this particular device. 20W @12V is 1.6A, but in order to power that from 5V, you would need at least 4 Amps, which puts it close to max, but not over it. The actual port connector is rated much higher than that.
Edit: I do have to agree on one point though, running on battery power will be pretty limiting, especially when you consider what the batteries were intended for in the first place.
Didn't read your reply, lehjr, before posting mine, sorry about that. Anyway, I'm still confused about whether or not the Lapdock is capable of powering the Wii. Here is where I'm getting my very limited information on basic electronics: http://science.howstuffworks.com/environmental/energy/question501.htm I will keep studying though, don't worry :laugh:
I'm going to post this on BenHeck Forums too for additional input. Again thank you guys.
lehjr said:
Battery Charging Specification 1.2: released in December 2010.
Several changes and increased limits, including allowing 1.5 A on charging ports for unconfigured devices, allowing high-speed communication while drawing a current up to 1.5 A, and allowing a maximum current of 5 A.
Hm, interesting. I don't know; I'd say even if the USB hardware might be able to withstand such high currents, it's only for a short while, not for continuous use. I'm talking about all the USB hardware now, not just the plugs and cords.
Anyway, while 5A might indeed be the theoretical maximum, I have yet to see a USB device that actually delivers anywhere close to that. Even most commercial "high-speed" chargers deliver only up to about 2A.
ravilov said:
Hm, interesting. I don't know; I'd say even if the USB hardware might be able to withstand such high currents, it's only for a short while, not for continuous use. I'm talking about all the USB hardware now, not just the plugs and cords.
Anyway, while 5A might indeed be the theoretical maximum, I have yet to see a USB device that actually delivers anywhere close to that. Even most commercial "high-speed" chargers deliver only up to about 2A.
Right, running that close to maximum is likely going to be short lived, and that's if it can be coaxed to go there in the first place. I'm not sure what the portability thing is about anyway. The Wii may be small, but it's heavy.
jeffreygtab said:
So you're saying that there's basically no way to power the Wii with the Lapdock's setup? Ughhh I assumed this would be a major issue but decided I'd leave the verdict to those more knowledgeable than myself... So you're sure there's no way? Well anyways I hope at the very least to connect an external Battery (recommendations?) and hopefully get the Wii to recognize the trackpad and keyboard esp. for linux use... I'll keep researching and keep you guys posted.
edit:
Didn't read your reply Lehjr before posting mine, sorry about that. Anyways I'm still confused about whether or not the Lapdock is capable of powering the Wii? Anyways here is where I'm getting my very limited information on basic electronics. http://science.howstuffworks.com/environmental/energy/question501.htm I will keep studying though don't worry:laugh:
I'm going to post this on BenHeck Forums too for additional input. Again thank you guys.
Possibly capable, very slim chance, but doing so would be running very close to max the entire time the Wii is powered. You would also have to build a device to convert 5V to 12V; again, not impossible, but you do lose some power due to conversion inefficiencies. Is there any particular reason you want the device to be portable? IMHO, in order to run the Wii for any length of time, you would need a decent set of batteries. A few amps plugged in is one thing; on battery power that's something else. I could easily see you using something like a couple of Power Wheels 6V batteries and a 12V charger or some similar setup, maybe some lithium cells if you're a big spender. Anything more than that and you're wheeling this thing around on a cart with a deep-cycle marine/RV battery.
Haha a definitive answer would be welcomed as to whether it's theoretically capable or not, but if you can't provide that, I completely understand! Anyways It just needs to be temporarily portable, like 1 hour battery life is plenty. Thanks... Btw I can't actually thank you guys anymore as I'm out of thanks.
At CES 2013, a little USB device made a few headlines for being able to allow boosted amperage from a PC USB port in order to charge smart phones and many tablets at speeds close to or even faster than their OEM AC charge adapters. It's called ChargeDr, and here are a few info links:
Three things I saw at CES that I'd actually buy:
http://ces.cnet.com/8301-34450_1-57563158/three-things-i-saw-at-ces-that-id-actually-buy/
ChargeDr lets you charge your tablet from a laptop USB port:
http://ces.cnet.com/8301-34439_1-57...ou-charge-your-tablet-from-a-laptop-usb-port/
Digital Innovations ChargeDr USB Charge Booster:
http://www.digitalinnovations.com/chargedr-usb-charge-booster.html
Basically, ChargeDr takes the 5V output of a USB 2.0 (0.5A) or USB 3.0 (0.9A) port and 'requests' a 5V output of up to 2.1A. The power coming from the ChargeDr is then equivalent to an OEM AC charge adapter. Pretty nice for something that will sell for about $30 when it finally ships soon.
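As a quick sanity check on those figures, here is a minimal sketch of the power available at 5 V for each of the current levels mentioned above; the 5 V rail is assumed throughout, and the current figures come straight from the description.

Code:
# Power available at 5 V for the current levels discussed above.
PORTS = {"USB 2.0 port": 0.5, "USB 3.0 port": 0.9, "ChargeDr-style request": 2.1}

for name, amps in PORTS.items():
    print(f"{name}: 5.0 V x {amps} A = {5.0 * amps:.1f} W")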
There are a few products (Chinese and South Korean) already on the market today that either join in on this technology or confuse consumers into thinking their product is the same.
Here's some of what I've seen:
Pisen USB Power Adapter Increases Amperage Converter:
http://www.ylmart.com/pisen-usb-power-aadptor-increases-amperage-converter.html
eBay
Costs between $4 and $5 USD.
Descriptions for this item usually say something general such as 'increases the amperage to 2000 mAh.' Sounds good, right? Why pay $30 when you can get a device that seems similar for only $5, and usually with free shipping? Wrong. I made a just-to-see purchase of this Pisen device and was totally disappointed. When connected to a PC, it cuts off the USB data pins so that your device thinks it's attached to an AC charger. With a Sprint Samsung Galaxy Note II as a test device, however, it took more than 8 hours to go from 18% power to 100%, which is about normal when charging the phone through a PC USB port without the Pisen. Weak!
Fastar REUM Adptor:
http://www.ebay.com/itm/quick-charg...ader_Chargers_Sync_Cables&hash=item1e77e2afa9
Costs between $17 and $22 USD.
I am intrigued by another device called the REUM Adptor. It's a South Korean product, and it sounds like it does almost exactly the same as what the ChargeDr claims to do. The REUM is sold on eBay for about $22 U.S.D., but you can also get it through Amazon for $17. This one looks like the real deal, so it'll be my next just-to-see purchase. I have PCs around me all the time, and it's more convenient to charge my smart phone through them than with an AC charger. The Galaxy Note 2's OEM AC charger can fast charge the phone from almost no power to 100% in a little over 3 hours. I'm all for getting similar results through a PC USB port!
So if you're interested in such things, my advice is to stay away from those $5 adapters that only cut off the USB data pins. Wait for the ChargeDr or try the less-expensive REUM.
I hope some of you find this post helpful. Have fun!
I'm intrigued! Please write a review once you get the REUM part; I'm especially curious how hot it gets. I mean, it's basic EE to use a transistor to increase the current (the collector current is a multiple of the base current coming from the USB port, with an appropriate circuit around it). But implementing it efficiently in such a small package without overheating could be a challenge.
Here goes.....
I've been struggling with charging this phone. I have both the Sprint and the T-Mobile versions and I'm seeing the exact same thing with both. This phone simply won't pull more than roughly 600 mA from a charger. With the best of equipment (chargers, cables, clean regulated power, etc.), ~600 mA (+/- 15 mA) is all I ever see at max draw.
I've tried all sorts of chargers including several stock. Same for cables. I know some cables are crappier than others and can restrict current....those that I found did that were tossed in the trash (don't want to keep the fubar cables anyway).
The phones have been in various states of operation too....from one extreme to the other....under heavy benchmark load to "first run" state from a complete reset (full wipe) with airplane mode on.
I said I have a "problem" above. What I mean is that my use case is such that I use navigation with Bluetooth streaming for podcast listening during my commute to and from work every day. While I'm at work I plug into my TV to play back video podcasts via MHL. At best.....BEST....I can maintain my current state of battery. In other words, if I'm at 39%, it'll stay thereabouts when I'm plugged into a charger in either case.
No..."Power Saver" doesn't help. What I think would actually help is if power saver had the ability to disable some cores in addition to just governing to 1.1Ghz across all four.
To have a kickass phone that you really honestly can't truly kick ass with feels weird. This thing simply discharges faster than it can charge under any real world load. Maybe my personal use case is unconventional but I feel like it's not THAT unconventional seeing as that the features I use wouldn't be built in if no one ever used them, ya know?
I first noticed this behavior with my previous phone, the EVO 4G LTE. Even though the behavior was there, it wasn't as bad or noticeable due to what I think is the fact it was only dual core...maybe other factors too. But I'm not a developer/engineer so my observations are only from the outside looking in.
I know I can "tweak" my behaviors -- "...do this, or do that. Disable this, or disable that." I understand these things. But having to disable a bunch of things sorta goes against the idea of having this device in the first place.
At the end of day, my observations are my own and I know some are going to suggest I'm "holding it wrong" or whatever, but you guys gotta admit there's something up here with the very limited charge rate.
All that said, I still enjoy the phone. :angel:
PS - I've been using this to monitor current/voltage. The tool has been verified to be working properly by two EE's at my work.
http://www.amazon.com/Micro-SATA-Cables-Voltage-Current/dp/B005Z1E3IY
The stock charger. How much does that one show it pulls?
Because I have a 1a charger for my car and I can have the screen on the entire time and it charges, slower, but it charges.
Felnarion said:
The stock charger. How much does that one show it pulls?
Because I have a 1a charger for my car and I can have the screen on the entire time and it charges, slower, but it charges.
Same same.....doesn't matter the charger capability. Peak draw at any one time seems to be roughly 600ma...+/-. As the charge of the phone gets closer to full, the rate tapers off to roughly 200ma and then to about 80ma as it gets really really close.
I know HTC is trying to protect the battery, but I really get the feeling that this is way over-protective.
I took a picture with my Sprint ONE of my T-Mobile ONE on a 2.1amp wall charger. Notice, at roughly 1/2 charge, it's only pulling about 600ma. Too daggon slow in my opinion.
dougxd said:
Same same.....doesn't matter the charger capability. Peak draw at any one time seems to be roughly 600ma...+/-. As the charge of the phone gets closer to full, the rate tapers off to roughly 200ma and then to about 80ma as it gets really really close.
I know HTC is trying to protect the battery, but I really get the feeling that this is way over-protective.
I took a picture with my Sprint ONE of my T-Mobile ONE on a 2.1amp wall charger. Notice, at roughly 1/2 charge, it's only pulling about 600ma. Too daggon slow in my opinion.
I assume you mean a 2.1a charger from a Samsung product, that won't work. Samsung uses some signaling on the D+/D- wires to show "This is a Samsung product, charge at 2.1a"
HTC One does not have the ability to offer this signaling and thus will charge at usb rates of 500-600.
Some products short the data wires of the USB to signal that it can supply extra power. This is the type of signaling the HTC One can use. You would need a charger with this capability.
If you don't mind, if you could take a picture of the same setup with the stock charger, that would help.
EDIT: This particular device you've linked seems to block any signaling, according to the reviews on Amazon. I think your problem may lie there.
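For the curious, here is a minimal sketch of the kind of detection logic being described; the roughly 1.2 V divider figure for Samsung chargers and the shorted-D+/D- dedicated-charger case are commonly cited conventions, used here as assumptions rather than verified specifications.

Code:
# Sketch of how a phone might classify a charger from the D+/D- voltages it sees.
# Threshold values are assumptions based on commonly cited charger conventions.
def classify_charger(d_plus_v, d_minus_v):
    same = abs(d_plus_v - d_minus_v) < 0.05
    if same and 1.1 <= d_plus_v <= 1.3:
        return "Samsung-style AC charger (divider holding D+/D- near 1.2 V)"
    if same and d_plus_v > 0.3:
        return "Dedicated charging port (D+ shorted to D-)"
    return "Standard USB host: stay at 500 mA or less"

print(classify_charger(1.2, 1.2))   # Samsung wall charger
print(classify_charger(0.6, 0.6))   # generic charger with shorted data pins
print(classify_charger(0.0, 0.0))   # ordinary PC USB port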
Felnarion said:
I assume you mean a 2.1a charger from a Samsung product, that won't work. Samsung uses some signaling on the D+/D- wires to show "This is a Samsung product, charge at 2.1a"
HTC One does not have the ability to offer this signaling and thus will charge at usb rates of 500-600.
Some products short the data wires of the USB to signal that it can supply extra power. This is the type of signaling the HTC One can use. You would need a charger with this capability.
If you don't mind, if you could take a picture of the same setup with the stock charger, that would help.
I understand what you mean. Note that in all my rambling on about various cables and chargers, if I wasn't explicit, I was implicit in that I've tried HTC stock gear too. Same results, no matter.
I do own and did try a few Samsung chargers and cables in addition to the myriad of others. I'm aware of Samsung's irritating attempts to lock people into using their accessories, thus the signaling modifications, but wanted to try them anyway. That's why I used a whole bunch of different chargers and cables. Some are not-so-good and others are great. The one charger I prefer is from VENTEV. It's a dual-port 4.2amp (2.1/per) wallwart.
http://www.amazon.com/Ventev-Wall-Charger-Dual-2-1A/dp/B00BEJSRDI
What I'm saying overall is that the big picture here suggests that we can't pull more than the peak 600ma or so charge rate, no matter what combination of doo-dads you toss at the phone. I'm all in if HTC has some super-secret special vapor rub one can use to charge faster, but even the stock charger they give us in the box which supports 1AMP doesn't deliver that since the phone itself doesn't pull more than what I've seen.
In the attached pics, the one of the charger is the Ventev. The other three show my T-mobile ONE just hit 90% charge and the rate has dipped to about 400ma on average. I took three snaps to show that it does fluctuate a bit. It'll ramp down more at about 95% or so and even more at 99%...................ALL of which is to be expected, I know.
To be clear, at this point, and what you see in these pics, is the stock HTC wall charger that we all get in the box with the phone, the stock HTC microUSB cable, and the measuring tool that is plugged into charger which the cable is then plugged into to then charge the phone. The meter can handle just over 2amps before it pops the internal fuse.
-Doug, KG3EK
dougxd said:
I understand what you mean. Note that in all my rambling on about various cables and chargers, if I wasn't explicit, I was implicit in that I've tried HTC stock gear too. Same results, no matter. [...]
I was directed here from another section of the forum. Have you tried using this?
https://play.google.com/store/apps/details?id=ccc71.bmw
With the stock charger it's telling me that I'm pulling in just under 950mA during peak charging and tapering off as it gets full.
I used a Samsung 0.7C charger and it charges at a constant +800 mA.
The HTC charger also sometimes shows 900 mA+, but typically around 200-700 mA.
It seems to fluctuate more often than the Samsung, which fluctuates to a lower range when nearing 100% charge.
Dude, I have the same problem with my EVO LTE, exactly as you describe it, but for some weird reason, my car charger charges my phone normally while other chargers will take several hours to fully charge. I really hope it's a problem with the chargers and not our phones :thumbup: :thumbdown:
Sent from my EVO
Hi. So based on what people have posted, does this mean there isn't much difference between charging it from the wall socket and from a USB port on a computer, since the charging rate is around 600-700 mA and a USB port charges at around 500 mA?
Sent from my HTC One using Tapatalk 4 Beta
http://androidcommunity.com/htc-one-doesnt-support-qualcomm-quick-charge-20130509/
So I downloaded that app, and no matter what I do with my settings, I cannot get my phone to charge faster than about 550 mA. I am currently running at 384 MHz CPU, 200 MHz GPU, with Force Charge (on/off) and a 50 mV undervolt. With the phone just lying there (screen on, 65% battery), as I stare at it, it currently says +529 mA (+23.00%/h).
I just tried playing a game with those same exact settings, and it showed it was charging at +217 mA.
Edit: Was redirected here and didn't notice it was an old topic. sorry for reviving?
Hi,
Does anyone know why it is such a goddamn crapshoot for charging speeds on the Galaxy Note 2 (or any Samsung device, for that matter)?
You buy a charger rated for 2 amps and you never know what it will give you.
You buy a micro-USB cable and get anywhere between 0.4 amps and 1.6 amps.
What criteria is the phone using to determine how many amps to pull from the charger? How does it even know what gauge of wire it is? Is there some sort of resistance check?
I have a Galaxy Note 10.1 and that is even more particular than the GN2. With most aftermarket chargers, it absolutely refuses to charge. I've had so much trouble finding a charger for it I've just stopped using the tablet since I only have one working charger for it.
It really sucks spending anywhere between $2-$30 dollars on a charger and not knowing if it will work. My success rate has been less than 10%.
I try to do forum and google searches, but all I seem to find are comments like "I bought this charger. Seems to work." With no detailed information on what performance they are getting out of it.
This is really turning me off samsung products. I don't have this problem with my HTC or LG android devices.
I don't know why you have problems. I have 2 Samsung devices (phones) and I charge them with their original chargers, the charger from a Nexus 7, and my old charger from a Desire HD, and all work just fine... Of course, the original one is the fastest, since it is 2A; the HTC one is 750mA, the Ativ S one is 500mA, and the N7 one is 1A....
dalanik said:
I don't know why you have problems,
You kind of answer this for yourself, as follows:
dalanik said:
I have 2 Samsung devices (phones) and I charge them with their original chargers, the charger from a Nexus 7, and my old charger from a Desire HD, and all work just fine... Of course, the original one is the fastest, since it is 2A; the HTC one is 750mA, the Ativ S one is 500mA, and the N7 one is 1A....
Your stock charger is 2A (about 1.5-2 hours to charge).
Your HTC charger is 750mA (about 4-5 hours to charge).
Your Ativ S charger is 500mA (no better than a computer port; 6-8 hours to charge).
This was my point. This IS my problem. Obviously the stock charger works at 2A, but with any other charger it is anyone's guess what speed you're going to get out of it. Even when they are specified to work at 2A, you are likely not going to get 2A out of it. The phone is so bloody fickle.
If there was some benchmark or specific set of criteria I could use when I purchase a new charger to know for certain if it will charge at 2A, then that would mitigate some of the problem at least. But right now, there is none as far as I can tell. When I purchase a charger, I literally have no idea if it will run at 2A with this phone.
I'm glad that you're not bothered by the slow charging speeds and are happy with <1A. I'm sure this works well for most people. It doesn't for me. I push my phone to the max (as I have every right to) and need a charger that can keep up.
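For what it's worth, those hour estimates fall out of simple arithmetic; here is a minimal sketch, assuming a roughly 3,100 mAh Note 2 battery and about 85% charging efficiency (both assumptions for illustration, and real phones taper the current near full).

Code:
# Naive charge-time estimate matching the figures quoted above.
BATTERY_MAH = 3100          # approximate Galaxy Note 2 battery (assumption)
CHARGE_EFFICIENCY = 0.85    # assumed losses in the charging circuit

def hours_to_full(charger_ma):
    return BATTERY_MAH / (charger_ma * CHARGE_EFFICIENCY)

for ma in (2000, 750, 500):
    print(f"{ma} mA charger: ~{hours_to_full(ma):.1f} h to full (before end-of-charge taper)")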
Well, charging slowly is different from what you described, i.e. "refusing to charge at all". And of course I don't use the Ativ's charger to charge the N2 often, as it would take ages, but I use HTC's 750mA charger and it charges within 2 hours, which is OK.
Anyway, the only solution for you is to buy a BRANDED charger from a company you can trust, not some cheap Chinese one. It should work just fine; whether it gives 2A or 1.9A really makes no big difference.
dalanik said:
Well, charging slowly is different from what you described, i.e. "refusing to charge at all". And of course I don't use the Ativ's charger to charge the N2 often, as it would take ages, but I use HTC's 750mA charger and it charges within 2 hours, which is OK.
Anyway, the only solution for you is to buy a BRANDED charger from a company you can trust, not some cheap Chinese one. It should work just fine; whether it gives 2A or 1.9A really makes no big difference.
Well, there are chargers that refuse to work. Especially with the Galaxy Note 10.1.
Cheap ebay chargers are a crapshoot, that much goes without saying. But there are many brand-name chargers that don't work at full speed, despite being rated for 2A.
It's not so much a charger thing as a Samsung thing. While I'm not able to find specific criteria as to how or why the phone decides to charge at the speed it does (which is really the only question I had with this thread), I can tell you that there are many brand-name products (Monoprice, Anker, nGear, etc.) that are rated for 2A but will not run at 2A with the Samsung. They will usually run at 2A with other products, though.
The more research I do, the more I suspect that this is a case of Samsung being proprietary. It looks like they are deliberately throttling aftermarket chargers to force you to buy their overpriced Samsung chargers. As I understand it, it has something to do with creating a voltage divider between two of the contacts, but every diagram I find shows a different wiring scheme, which would indicate that no one really knows for sure.
The one and ONLY question I have with this thread is to find out what criteria the N7100 uses to determine how much amperage to draw. I remain confident that no one will answer this question because it seems no one knows.
For the Note 2: there is a way to get a simple measurement of how much current is being pulled. Refer to this thread for the apk and more info.
alpha-niner64 said:
For the Note 2: there is a way to get a simple measurement of how much current is being pulled. Refer to this thread for the apk and more info.
Thanks for posting this. I suppose I should have mentioned that I have this app already and it is incredibly useful. I also have this, which, with only a few bizarre exceptions, reports the same as the app.
The more people who are aware of this app, the better. People who think that their aftermarket charger "works fine for me" are probably unaware of how much those chargers are under-performing.
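If you would rather not rely on an app at all, you can also poll the current the kernel reports over adb. This is only a sketch: the sysfs path below is a common one but it is an assumption for any given ROM, and the units (µA vs mA) and sign convention differ between devices:

```python
# Rough sketch: poll the charging current the kernel reports, via adb.
# Assumes adb is on your PATH, USB debugging is enabled, and that this ROM
# exposes current_now at the path below (path, units and sign vary by device).
import subprocess
import time

SYSFS_NODE = "/sys/class/power_supply/battery/current_now"  # may differ on your device

def read_current_now() -> int:
    """Return the raw current_now value (often microamps, sometimes milliamps)."""
    out = subprocess.check_output(["adb", "shell", "cat", SYSFS_NODE])
    return int(out.decode().strip())

for _ in range(5):
    print("current_now =", read_current_now())
    time.sleep(2)
```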
I don't have the education to explain your situation well, but it boils down to electrical engineering and the physics of electricity.
There are quite a few variables that all affect the charging of devices. The first thing is what the device requires for charge input, which is both amps and voltage. For whatever reason, quite a few tablets require 15 volts versus the 5 volts that most mobile phones need. I have this same issue with my ASUS Transformer Infinity pad: it requires 2.0 amps at 15 volts. I have a Galaxy Note II which needs 2.0 amps at 5 volts. Unfortunately, when I use my phone charger with the tablet, it puts out enough to register that a charger is plugged in (it turns the tablet on if it is off), but not enough to trigger actual charging. It does charge, but only as a trickle charge; basically, if the tablet is in use while plugged in, the charger only slows the battery depletion rate.
As for the charge output, now you're getting into build quality and the resistance of the components of the charger itself and of the USB cable being used.
And then, depending on the device, the pins used on the USB cable can have an effect too. This mostly occurs with tablets or proprietary cables, because the pins may tell the hardware what kind of charger is being used, which may have built-in limits for charging.
Hopefully that helps some.
lovekeiiy said:
I don't have the education to explain your situation well, but it boils down to electrical engineering and the physics of electricity.
There are quite a few variables that all affect the charging of devices. The first thing is what the device requires for charge input, which is both amps and voltage. For whatever reason, quite a few tablets require 15 volts versus the 5 volts that most mobile phones need. I have this same issue with my ASUS Transformer Infinity pad: it requires 2.0 amps at 15 volts. I have a Galaxy Note II which needs 2.0 amps at 5 volts. Unfortunately, when I use my phone charger with the tablet, it puts out enough to register that a charger is plugged in (it turns the tablet on if it is off), but not enough to trigger actual charging. It does charge, but only as a trickle charge; basically, if the tablet is in use while plugged in, the charger only slows the battery depletion rate.
As for the charge output, now you're getting into build quality and the resistance of the components of the charger itself and of the USB cable being used.
And then, depending on the device, the pins used on the USB cable can have an effect too. This mostly occurs with tablets or proprietary cables, because the pins may tell the hardware what kind of charger is being used, which may have built-in limits for charging.
Hopefully that helps some.
That last part is correct. I actually know enough about electrical circuits to be pretty sure it is the phone deciding how much power to pull.
Ohm's law states that the amperage of a circuit is the voltage of the circuit divided by the resistance (in ohms).
USB circuits are almost universally 5 volts. I remember reading somewhere that a phone has a regulator that protects it from incorrect voltages, up to a certain amount. This is probably why you can get away with sticking a 15V charger onto your phone and not blowing it up. You cannot depend on this, however. Generally, you do not want to plug your device into a charger with a different voltage rating than what the device is rated for.
The charger sets the voltage, using internal circuitry that converts AC mains voltage (120VAC if you're American) down to 5VDC (USB) or whatever your device needs. The part that turns AC into DC is called a rectifier.
As stated above, the charger sets the voltage. The battery determines the resistance*, therefore the amperage is the natural result of dividing the voltage by the resistance.
*Resistance is added to the circuit by the wire and the charger itself, but it is usually inconsequential.
When a charger says that it is rated for a certain amperage, that means it is the maximum amount of current the internal components can handle safely, without running the risk of earth-shattering kabooms (fire). If the circuit you have connected to your charger contains too little resistance, you will increase the amperage (Ohm's law, as stated above), and you may end up with a piece of charcoal where your charger used to be.
The fact that Samsung phones can change the amperage of a charging circuit so readily must mean the phone is capable of changing its effective resistance. So the question becomes: what criteria is it using to determine when to change that resistance, and to what?
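To put toy numbers on that last point (purely illustrative values, not measurements from a real Note 2):

```python
# Toy Ohm's-law numbers, illustrative only: the adapter fixes the voltage, so the
# effective resistance the phone presents is what decides how much current flows.
SUPPLY_VOLTAGE = 5.0  # volts; standard USB

def current_draw(effective_load_ohms: float) -> float:
    """I = V / R."""
    return SUPPLY_VOLTAGE / effective_load_ohms

for load_ohms in (2.5, 5.0, 10.0):
    print(f"{load_ohms:4.1f} ohm effective load -> {current_draw(load_ohms):.1f} A drawn")
```

So a phone that behaves like a 2.5 ohm load pulls the full 2A, and the same phone can drop to 0.5A simply by raising its effective resistance to 10 ohms, with nothing about the charger changing.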
-PW
This may be the longest thing I've ever typed on my phone.
I'm not disagreeing, since, as you said, the mobile device manufacturers have built in some safeguards so we don't fry them with incorrect chargers or overcharging.
But there are chargers that are 15V. I have one that came with my ASUS Transformer Infinity Pad, and I think many Samsung tablets are in the same boat. I don't recall using that charger on any of my smartphones; if I have, it's only been once or twice, and quite possibly never. But as stated earlier, I have used my phone chargers on the tablet, and they only manage a trickle charge. That tablet has a wide input plug at the end of the USB cord. I'm thinking one of the pins must not get enough power to trigger the full charge. Yet if I use my Anker external battery, set it to 15V, and a few adapters, it triggers the normal charge cycle.
Don't forget, phones such as the Galaxy Note 2 and Galaxy S3 use 11-pin micro-USB ports versus the standard 5. I have no idea what all the pins do or trigger; my assumption is that part of the answer to why the charge output varies lies in how they use the other pins. I know quite a few tablets have more than 5 pins, since the USB port is some wide thing; the ASUS does because it carries data and power for the separate keyboard that can be attached to make a pseudo-laptop, which has USB ports, a battery and a full 104-key keyboard; I don't recall what other ports the attachment may have.
I still hold that part of the charging difference is also the USB cord itself, since different materials have different resistance. It may not be as significant as the charger itself, but I've seen significant differences in charging times or depletion rates (around 10% battery per hour) with MHL adapters purely down to the USB cables.
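Here's a back-of-the-envelope example of the cable effect, using assumed (not measured) resistances; in practice the phone may also back off its draw when it sees the voltage sag, which stretches charge times even further:

```python
# Back-of-the-envelope cable losses with assumed (not measured) resistances.
# A long or thin cable drops more voltage at a given current, so less power
# actually reaches the device.
CHARGER_VOLTAGE = 5.0  # volts at the charger output
CHARGE_CURRENT = 2.0   # amps the device is trying to pull

for label, cable_ohms in [("short, thick cable", 0.1), ("long, thin cable", 0.5)]:
    volt_drop = CHARGE_CURRENT * cable_ohms        # V = I * R lost in the cable
    volts_at_device = CHARGER_VOLTAGE - volt_drop
    watts_delivered = volts_at_device * CHARGE_CURRENT
    print(f"{label}: {volts_at_device:.1f} V at the device, ~{watts_delivered:.1f} W delivered")
```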
Yes, typing out long replies on the phone's virtual keyboard blows monkey chunks. Thus, I use a Bluetooth keyboard instead for those situations. I also have a Bluetooth mouse, LOL.
Will this charger from Huawei fast charge my Note 9? It is the only legit one I can buy locally.
https://www.amazon.co.uk/Huawei-Sup...06XQTYRC4/ref=cm_cr_arp_d_product_top?ie=UTF8
@DarthJe5us
Great question.
In my experience with Samsung and Fast Charging, it appears to work under two conditions in my vehicles:
If I use my official Samsung car charger and cables with my device, it works well:
https://www.amazon.co.uk/Genuine-Sa...553361843&sr=1-5&keywords=samsung+car+charger
But if I use third-party cables and a third-party charger that supports USB-C to USB-C, it works just as efficiently.
Here's what I'm using currently:
https://www.amazon.com/Blackweb-Cha...&qid=1553362594&s=electronics&sr=1-1-fkmrnull
Obviously this brand is not available in the UK but here is something similar:
https://www.amazon.co.uk/Belkin-Cha...504&sr=1-46&keywords=Car+Charger+USB+C+huawei
AHE_XDA said:
@DarthJe5us
Great question.
In my experience with Samsung and Fast Charging, it appears to work under two conditions in my vehicles:
If I use my official Samsung car charger and cables with my device, it works well:
https://www.amazon.co.uk/Genuine-Sa...553361843&sr=1-5&keywords=samsung+car+charger
But if I use third-party cables and a third-party charger that supports USB-C to USB-C, it works just as efficiently.
Here's what I'm using currently:
https://www.amazon.com/Blackweb-Cha...&qid=1553362594&s=electronics&sr=1-1-fkmrnull
Obviously this brand is not available in the UK but here is something similar:
https://www.amazon.co.uk/Belkin-Cha...504&sr=1-46&keywords=Car+Charger+USB+C+huawei
I think I can find the Samsung car charger here, but I was interested in getting the Huawei one so I don't have to buy another one in the future if I change my phone. I would like to know whether it works with the Note 9 before I buy it, since it is 15 euros here.
I can only theorize based on the amperes and voltage. It gets difficult after that, and I'll explain why.
Samsung's 'Adaptive Fast Charging' provides, under optimal conditions, 1.7A at 9V, which equates to roughly 15.3 watts.
Huawei AP38's 'SuperCharge' provides, under optimal conditions, 5A at 4.5V, which equates to roughly 22.5 watts.
So, based on the numbers alone, we're left to theorize that the AP38 'should' be the best solution.
So will the 'Huawei AP38' provide your Note 9 with the same charge that a 'SuperCharge'-enabled device enjoys? Likely not.
Will it charge faster? Absolutely. Will that charge be equivalent to a Samsung-branded one? No.
A series of competing charging standards exist ('Adaptive Fast Charging', 'SuperCharge', 'VOOC', 'QuickCharge'), and they all use different technologies to deliver MORE power IF the charger and device are certified to work together.
So, at day's end, use high quality cables alongside your Huawei AP38 and you'll enjoy faster charge speeds.
Just NOT as fast as advertised.
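For what it's worth, here is the rough arithmetic behind the wattage comparison above, using the Note 9's 4000mAh battery and an assumed ~3.85V nominal cell voltage, and pretending both chargers deliver their full rated power for the entire charge, which real charge curves never do, so these are idealized lower bounds rather than predictions:

```python
# Crude comparison of rated charger power against the Note 9's 4000 mAh battery.
# The ~3.85 V nominal cell voltage is an assumption, and real charging tapers off,
# so these times are idealized lower bounds, not predictions.
BATTERY_WH = 4.0 * 3.85  # 4000 mAh * ~3.85 V nominal ≈ 15.4 Wh

chargers = {
    "Samsung Adaptive Fast Charging (9 V x 1.7 A)": 9.0 * 1.7,  # ≈ 15.3 W
    "Huawei AP38 SuperCharge (4.5 V x 5 A)": 4.5 * 5.0,         # 22.5 W
}

for name, watts in chargers.items():
    hours = BATTERY_WH / watts  # assumes the battery accepts full power throughout
    print(f"{name}: {watts:.1f} W -> ~{hours * 60:.0f} min (idealized)")
```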
https://www.iottie.com/Product/Detail/5064/Easy-One-Touch-4-Wireless-Fast-Charge-Qi-Mount-_Online_