270 Degree Servos - Android Studio

I am part of FTC Team 535 and I have a question I can't seem to get answered anywhere. We recently purchased two 270-degree servos from DFRobot and I cannot get them to work over their full range. If I declare them as a standard Servo object I can only reach angles from 0 to 180. I was looking through the SDK index and found code for PWM controllers, but I could not get it implemented for the life of me. Any help would be greatly appreciated.
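A sketch of one possible route, assuming the standard FTC SDK: configure the servo as `ServoImplEx` and widen its PWM range so the full travel maps onto the 0.0-1.0 position scale. The 500-2500 µs endpoints below are an assumption (check the DFRobot datasheet); the runnable part just shows the angle/pulse arithmetic.

```java
public class ServoRange {
    // Map a target angle (0..270 degrees) to a pulse width in microseconds,
    // ASSUMING the servo sweeps its full travel between 500 and 2500 us.
    static double angleToPulseUs(double degrees) {
        return 500.0 + (degrees / 270.0) * (2500.0 - 500.0);
    }

    // Convert that pulse width back to the 0.0-1.0 position the SDK expects
    // once the PWM range has been widened to 500-2500 us.
    static double pulseToPosition(double us) {
        return (us - 500.0) / (2500.0 - 500.0);
    }

    public static void main(String[] args) {
        // Inside an OpMode the idea would look roughly like this
        // ("servo270" is a placeholder config name):
        //   ServoImplEx servo = hardwareMap.get(ServoImplEx.class, "servo270");
        //   servo.setPwmRange(new PwmControl.PwmRange(500, 2500));
        //   servo.setPosition(pulseToPosition(angleToPulseUs(270)));
        System.out.println(angleToPulseUs(135.0));   // midpoint angle
        System.out.println(pulseToPosition(2500.0)); // full travel
    }
}
```

If the range stays at the default, the SDK clamps you to whatever arc the default pulse window covers, which matches the 0-180 behavior described above.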


Possible new Idea for Sensor-App: Augmented Reality

Hi,
this is a possible new idea for a sensor app.
I don't know if the accelerometer/position sensors are accurate enough to track and compute movement. But let's assume you can put the phone on one spot, move it around the room, return it to that spot, and the software then calculates the movement from the sensors and sees that it has returned to the starting point.
If the sensors are exact enough, you can simulate a 3D environment: move the phone around and use the screen to look at the 3D environment, with the view changing accordingly as you move. Possibly add a live camera view and you can simulate virtual objects in the room.
Anyone?
Magnus
From my (limited) experience programming with the sensors, they are certainly sensitive enough for this. I suppose one difficulty would be knowing where the Diamond was being held in relation to the body: if the user waves the device around, that might throw off the calculation of the virtual position of the user's body in the virtual space.
Great idea!
The accelerometer registers changes of position relative to an inherent X-Y-Z axis. It cannot track movement in space, only the orientation of its axes. So this will not work.
SamLowrie111 said:
Hi,
this is a possible new idea for a sensor app.
I don't know if the accelerometer/position sensors are accurate enough to track and compute movement. But let's assume you can put the phone on one spot, move it around the room, return it to that spot, and the software then calculates the movement from the sensors and sees that it has returned to the starting point.
If the sensors are exact enough, you can simulate a 3D environment: move the phone around and use the screen to look at the 3D environment, with the view changing accordingly as you move. Possibly add a live camera view and you can simulate virtual objects in the room.
Anyone?
Magnus
Android will have this in Google Maps (have you seen the vid?), but it has to use the built-in compass.
Jorlin said:
The accelerometer registers changes of position relative to an inherent X-Y-Z axis. It cannot track movement in space, only the orientation of its axes. So this will not work.
Good point, but moving around a room would involve some acceleration of the device, which it would register, right?
Of course; the amount of acceleration can be used to calculate the distance moved.
When moving 1 meter (slowly, say over 6 seconds) the sensor would give roughly this output:
seconds:            1   2   3   4   5   6
output (1/10th G):  2   1   0   0  -1  -2
Jorlin said:
The accelerometer registers changes of position relative to an inherent X-Y-Z axis. It cannot track movement in space, only the orientation of its axes. So this will not work.
Actually, it registers acceleration, which is a change of velocity, not position. If you move it at a constant speed, it won't register anything (apart from the gravitational acceleration that is always present, of course).
Riel said:
Of course; the amount of acceleration can be used to calculate the distance moved.
When moving 1 meter (slowly, say over 6 seconds) the sensor would give roughly this output:
seconds:            1   2   3   4   5   6
output (1/10th G):  2   1   0   0  -1  -2
You can integrate a(t) to get v(t) and then integrate v(t) to get s(t), but you would need to know the boundary conditions v(0) and s(0), i.e. the initial velocity and position. If you are moving it from a standstill then of course the initial velocity is zero, and for a distance in a local coordinate system you can set s(0) = 0.
Also, given the imprecision of the sensor, you couldn't count on much accuracy, and you would be bothered by the constant -g acceleration, which you would need to compensate for. That of course would be difficult if you did not hold the phone perpendicular to the ground while performing an accelerated motion.
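The double integration described above can be sketched numerically. This is a hedged illustration only (trapezoidal rule, assuming gravity has already been subtracted from the samples and v(0) = s(0) = 0), not production sensor-fusion code:

```java
public class Integrate {
    // Trapezoidal double integration: acceleration samples (m/s^2) taken at
    // a fixed interval dt (s) -> displacement (m), with v(0) = s(0) = 0.
    static double displacement(double[] a, double dt) {
        double v = 0.0, s = 0.0;
        for (int i = 1; i < a.length; i++) {
            double vNext = v + 0.5 * (a[i - 1] + a[i]) * dt; // v(t) = integral of a
            s += 0.5 * (v + vNext) * dt;                     // s(t) = integral of v
            v = vNext;
        }
        return s;
    }

    public static void main(String[] args) {
        // 1 m/s^2 held for 1 s should give s = a*t^2/2 = 0.5 m
        double[] a = new double[1001];
        java.util.Arrays.fill(a, 1.0);
        System.out.println(displacement(a, 0.001));
    }
}
```

Note that any sensor bias integrates into a drift that grows quadratically with time, which is why the accuracy concern above is very real in practice.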
You can filter out ALL Z-axis g's for vertical movement.
Tilting the phone would then only result in slower or faster movement in the augmented reality.
Placing a 3D arrow pointing straight ahead of you would then always indicate the corresponding direction in the augmented reality: if you hold the phone diagonally, the arrow points there, and so does the movement in AR.

touchscreen to measure mass?

If the touchscreen is pressure-sensitive on the Xperia, of course. Does anyone think it would be possible to code a program to measure that pressure as mass?
I think it would be so sick to use the Xperia as a scale.
It cannot measure mass. Any more pressure will break the screen. Use your common sense!
It's not a matter of common sense.
What do you mean, any more pressure would break the screen?
I'm sure by slightly touching my screen I'm putting less than a gram of pressure per sq. inch on it, so I'm not sure what you're talking about; I'm not going to weigh a boulder on the thing. Use your common sense..
Yes, it's definitely possible and would not be too hard to code. One way would be to define a measurement area ("scale") on the screen, then gradually increase the sensitivity settings in the registry (via a program, of course) until a touch is registered in that area. Initially, the registry values would need to be calibrated against a set of small weights (up to a reasonable maximum, of course). Anyone got their high-school physics weights?
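The calibration step described above amounts to a two-point linear fit. A hypothetical sketch (the names and the assumption of a roughly linear sensor response are mine, not from the thread):

```java
public class ScaleCal {
    // Two-point linear calibration: given raw sensor readings r1 and r2
    // observed for known reference weights g1 and g2 (grams), map any
    // raw reading to grams, assuming the response is roughly linear.
    static double toGrams(double raw, double r1, double g1, double r2, double g2) {
        return g1 + (raw - r1) * (g2 - g1) / (r2 - r1);
    }

    public static void main(String[] args) {
        // Calibrated with 0 g -> reading 100 and 100 g -> reading 200;
        // a reading of 150 then falls halfway between the references.
        System.out.println(toGrams(150.0, 100.0, 0.0, 200.0, 100.0));
    }
}
```

With more than two reference weights, a least-squares fit would average out noise, but the two-point version is enough to test whether the screen responds linearly at all.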
But yo, what would be the good of a tiny xperia scale?
Y'all are crazy.
I have installed your soft touch on my Xperia and I love it. Where would the registry settings to change the sensitivity be found?
I think it's a really interesting idea to test.
I think it can be done, since the X1's touchscreen is resistive, so it will be able to sense gradations in changing pressure.
Had you had an iPhone, its capacitive screen would have made this impossible.
The thing is, working with registry settings won't do the trick in my opinion. I think you need something more low-level (like a driver, maybe) to talk directly to the touchscreen.
If I were you I'd go and check the WM 6.1 SDK and see what it makes visible through its API for the touchscreen part.
It would be worth investigating how the driver accesses the touchscreen hardware.
I'd be happy to try and help with the programming btw
It's the finger-pressure registry setting that changes it.
But storm' is right. I forgot that those registry settings don't take effect until a reset, so you'd need another method to either dynamically change the sensitivity or capture the value of the pressure as it is being applied.
OK, thanks storm. Seeing as this would be my first ever program to code, I would really appreciate the help.
I was looking at the SDK site last night but didn't quite know what I was looking for; now I'll research the touchscreen driver(s), how they are accessed by the phone, and how we can use them to our benefit.
There is a touch.dll file in the Windows folder; I'm wondering if this registers the pressure applied.
Sweet,
I'm also gonna investigate more
Keep you posted
Hmmm, I guess the first step would be to create a program that accesses touch.dll to see if it records the pressure applied?
3 guesses as to what you guys want to use this for
SamAsQ said:
3 guesses as to what you guys want to use this for
LOL They'd be better off with a Touch Pro. Don't want evidence getting under that recessed screen.
e: Bloody great idea though. I'm not sure how it'll really work or how accurate it'll be... An object placed on the screen might have multiple contact points, and as the screen cannot detect multiple points, pressure from the weight might be exerted elsewhere on the screen and not detected.
squidgyb said:
LOL They'd be better off with a Touch Pro. Don't want evidence getting under that recessed screen
Hahahahahaha :D:D:D:D
SquidgyB said:
LOL They'd be better off with a Touch Pro. Don't want evidence getting under that recessed screen.
e: Bloody great idea though. I'm not sure how it'll really work or how accurate it'll be... An object placed on the screen might have multiple contact points, and as the screen cannot detect multiple points, pressure from the weight might be exerted elsewhere on the screen and not detected.
True, Squidgy, but...
OK, but you know the fish panel?
I can place four fingers on the screen and it will find the exact center; lift one finger and it will find the exact center of the three remaining fingers, etc. Maybe this can help us in our mission.
So say you have a nice beautiful green flower that is making contact at three separate points on the screen. Maybe we can incorporate what is going on in the fish panel to find the center, and compare that against the pressure applied, which touch.dll will hopefully give us and which we hope to figure out by placing weights on the screen.
I don't think the SDK will help us in our pursuit... I think it only gives back X,Y pairs.
We'd have to get pretty low-level on this one.
The thing is, in theory it's actually doable.
http://www.scribd.com/doc/12804586/fourwire-resistivetype-touch-screen-with-usb-interface
This guy built his own drawing "board" using a resistive touchscreen. The interesting thing is that he provides two methods of actually calculating the touch resistance, which means that:
1) it's possible to use it as a balance, because the resistance depends on the pressure, and the pressure depends on the mass in our case;
2) it doesn't matter how many contact points you have... There's only one Rtouch, which means it reflects the overall pressure exerted on the touchscreen. Even though you can only determine one X,Y pair, that's of no interest to us anyway.
All this to say that in theory this is actually possible... The only problem is how to access the hardware.
At least this is my take on it, but I might be wrong.
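For reference, the pressure measurement in write-ups like the one linked follows the standard four-wire resistive formula (also given in resistive-controller datasheets such as the ADS7846): touch resistance is derived from the X reading and two cross-plate Z readings. A hedged sketch, with variable names of my own choosing:

```java
public class RTouch {
    // Standard 4-wire resistive touch-pressure formula:
    //   Rtouch = RxPlate * (x / adcMax) * (z2 / z1 - 1)
    // Lower Rtouch corresponds to firmer overall pressure, regardless of
    // how many contact points contribute to it.
    static double touchResistance(double rxPlate, int x, int z1, int z2, int adcMax) {
        return rxPlate * ((double) x / adcMax) * ((double) z2 / z1 - 1.0);
    }

    public static void main(String[] args) {
        // e.g. a 400-ohm X plate, mid-screen touch, z1=400, z2=600, 12-bit ADC
        System.out.println(touchResistance(400.0, 2048, 400, 600, 4096));
    }
}
```

The catch remains the one raised above: the stock WM driver may only expose X,Y, so getting at z1/z2 likely means talking to the touch controller directly.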
stormlv said:
1) it's possible to use it as a balance, because the resistance depends on the pressure, and the pressure depends on the mass in our case;
2) it doesn't matter how many contact points you have... There's only one Rtouch, which means it reflects the overall pressure exerted on the touchscreen. Even though you can only determine one X,Y pair, that's of no interest to us anyway.
All this to say that in theory this is actually possible... The only problem is how to access the hardware.
At least this is my take on it, but I might be wrong.
Nice research.
Cool, that's what I was thinking about the screen. Squidgy's thinking seemed logical, but if the resistive screen already measures the overall pressure, that's perfect.
OK, so now we know that it is "theoretically" possible; we just gotta get to action.
I'm gonna be in Vegas this whole weekend, so I'll try to update my progress when I can.
Tool for resistance measurement
If the touchscreen panel that you want to measure is resistive, you can measure its force/displacement and resistance using one of the switch testers offered by a company called TRICOR Systems.
The touchscreen would have to use resistive technology in order to measure the resistance. Most of the newer touchscreens use either capacitive or surface acoustic wave technology.

possible graphics performance enhancements

Hi,
I spent the last couple of weekends trying to improve graphics performance on the nook color. I had two approaches and got stuck on both, but I thought I'd write them down here in case anybody else can figure out where I was going wrong.
16BPP mode:
The panel supports native 16bpp as far as I can tell (it seems that u-boot drives it in 16bpp). I tried disabling the CONFIG_FB_OMAP2_32_BPP kernel option, and changing the numbers in panel-boxer.c and board-3621_boxer.c to request 16bpp mode. The kernel compiles and boots fine and does draw output on the LCD (and surfaceflinger on boot reports 565 16bpp mode); however, the LCD was being turned off any time there was no animation running, with an error in the kernel log about a dispc GFX FIFO underflow. I spent some time trying to figure out how the FIFO high/low watermarks were calculated but wasn't able to get reliable output. There shouldn't be any underflow in 16bpp mode, as it ought to be using less memory bandwidth than 32bpp mode to scan out to the LCD...
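That bandwidth claim checks out with back-of-envelope numbers for the Nook Color's 1024x600 panel at 60 Hz. A rough scanout-only estimate (ignoring blanking intervals and any compositor copies):

```java
public class Scanout {
    // Bytes per second needed just to scan the framebuffer out to the LCD.
    static long scanoutBytesPerSec(int w, int h, int bytesPerPixel, int hz) {
        return (long) w * h * bytesPerPixel * hz;
    }

    public static void main(String[] args) {
        System.out.println(scanoutBytesPerSec(1024, 600, 4, 60)); // 32bpp: ~147 MB/s
        System.out.println(scanoutBytesPerSec(1024, 600, 2, 60)); // 16bpp: ~74 MB/s
    }
}
```

So 16bpp halves the scanout load, which is exactly why an underflow appearing only in 16bpp mode points at misconfigured FIFO watermarks rather than genuine bandwidth starvation.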
USE_COMPOSITION_BYPASS:
Last year it looks like some work was done for the Samsung S5PC110 SoC (as used in the Nexus S, etc.) to allow fullscreen OpenGL apps to draw directly to the framebuffer, instead of drawing to a surface and having surfaceflinger then copy that surface to the framebuffer. This could be a huge graphics performance win as it avoids a fullscreen copy every frame. I tried enabling it in a CM7 build (and fixed the couple of compile errors relating to the changed surface locking semantics -- basically just remove unlockClients() and add hw.compositionComplete()); however, as soon as SurfaceFlinger determined that a Surface could be given the framebuffer, the client of that surface died. I found that the GraphicBuffer allocated in Layer.cpp:381 had an fd of -1, which binder was refusing to send over to the client because -1 isn't a valid fd. This fd comes back out of gralloc.omap3.so, which is closed source.
I'm not sure why the fd is -1, since it seems the buffer was allocated successfully -- I guess there's no reason to have a valid fd for a framebuffer-backed GraphicBuffer normally because they always stay in surfaceflinger. I thought of maybe just opening /dev/graphics/fb0 and using that fd, but didn't have time to try it.
Anyway, that's as far as I got on those two ideas before I ran out of time. Hopefully somebody else can pick up where I left off. The composition bypass is probably the best optimization if it works, because it's essentially free and would cause no decrease in graphics quality (like 16bpp mode would). It would only benefit fullscreen (no system bar) OpenGL apps (like most games).
Also, I would have posted in Android Development, but I haven't posted 10 times yet...

[Q] Improving x8gesture

I am actually hoping to speak with doixah directly, but I'm a newbie so I can't post in that thread. I do hope doixah reconsiders his position in this regard. The thought I have about improving x8gesture is focused on improving the fake dual-touch capability of the phone. I was wondering if doixah can do this:
Assuming we only have one sensor for the finger: every time we press a specific area, one at a time, the phone instantly recognizes it, right? So why not place a loop in every instance where dual touch is required? The idea is to recognize both fingers in a threaded way. Because no two fingers can be at the same position at the same time, every time the reading switches from one finger to the other, the coordinate is passed on as if there are two fingers...
Since both contact points are always recognized in the dual-touch modes, why not just swap and retain each coordinate that is not exactly or partially the same as the other?
A pseudocode sketch of the idea I want to impart:
coor = finger1.coor   // sample of the 1st contact
coor2 = finger1.coor  // sample of the 2nd contact
while (screen is touched) {
    coor = finger1.coor          // current single-touch sample (1st end)
    delay(1 ms)
    if (finger1.coor != coor)    // the sample jumped to the other end
        coor2 = finger1.coor     // so record it as the 2nd end
}
I am assuming that for every delay a new coordinate is scanned, so coor and coor2 are bound to acquire different coordinates, which will be fed to the Android OS; thus dual touch can be simulated from a single touch. Hopefully doixah notices this, or someone is kind enough to forward the idea to him.
It's doixanh.
Sent from my X8 using Tapatalk
Sorry, I don't mean to disrespect. Hello, please somebody help me out with doixahn. I do believe this is the safest way to simulate dual touch with one fingerprint; it just needs the proper arguments... Hellllllppppp
Sorry... help, doixanh...
Hi, I don't mean to be disrespectful, but if you wanted DX's help you should have written him a PM. DX himself stated that he doesn't need/want to improve DT on the X8, so he is probably more interested in froyobread development than in DT improvements.
As for the idea, I don't think it would be useful. Gestures for zoom work right; the only thing that could be better is, IMHO, game mode. So even if the DT simulation worked, it wouldn't be very effective and it would cost a lot of performance, so games would be unplayable with it. Also, I don't think it could ever be done this way: even if you swapped the coordinates you wouldn't get two fingers, you would have one spot pressed plus one finger moving at a time, which would be very ineffectively written code for just a single touch.
I don't say I'm right, it's just my opinion; if someone managed to do this, I would be happy as hell.
Mr. Hat said:
Hi, I don't mean to be disrespectful, but if you wanted DX's help you should have written him a PM. DX himself stated that he doesn't need/want to improve DT on the X8, so he is probably more interested in froyobread development than in DT improvements.
As for the idea, I don't think it would be useful. Gestures for zoom work right; the only thing that could be better is, IMHO, game mode. So even if the DT simulation worked, it wouldn't be very effective and it would cost a lot of performance, so games would be unplayable with it. Also, I don't think it could ever be done this way: even if you swapped the coordinates you wouldn't get two fingers, you would have one spot pressed plus one finger moving at a time, which would be very ineffectively written code for just a single touch.
I don't say I'm right, it's just my opinion; if someone managed to do this, I would be happy as hell.
Yes, but the fingers would switch: this would stop one finger for 1 ms and allow the other finger to move for 1 ms, then again and again, and then this would look like real dual touch.
Yes, I get that, but how would you know which one is moving at the time? Correct me if I'm wrong, but when there are already two spots pressed and the coordinates are already swapped, you can't move the first finger; you would have to raise one finger and then press the screen again, because only one touch is recognized. To make it work this way we would need a DT digitizer; this is not possible on a single-touch screen. The 1 ms switching is useless: you can't recognize whether the moving finger is the first or the second one, you would only know that one of them is moving.
Well, for example in dual-touch games...
We normally have two sticks placed at ideal positions, right? So suppose this is the bottom part of your screen and your controls are placed at the bottom, like in figure 1 in my attached files.
If we swap coordinates between the two points and detect the presence of the finger within a prescribed range: imagine the asterisks as finger 1 and finger 2 and the 0s as their ranges. Ideally, even with only a single touch, since we have a prescribed region where each touch must be made, two distinct coordinates can always be extracted from a single-touch screen within an interval. Another illusion would be our hand ideally touching a side of the screen; this is for non-game mode, though I doubt there is a need to improve it.
Imagine this scenario: figure 2 in my attached files.
Since ideally we are holding the phone at two different places at a time, imagine both our thumbs holding a position near each asterisk. We can move them around within the rectangular boundaries and swap coordinates relative to the asterisks. Again, in every interval a different coordinate must be swapped, so to specify which is which we just have to set boundaries and relative positions to identify that this finger is holding coordinate 1 and the other is holding coordinate 2.
The only real limitation I see is once a finger breaches the region specified for dual touch; other than that, I do believe this kind of illusion is more than enough for most applications.
Well, this is pseudocode for my figure 1 illusion. By default we are going to include ranges; I will represent them with arrays:
 1  2  3  4  5  6  7  8  9
10 11 12 13 14 15 16 17 18
19 20 21 22 23 24 25 26 27
Suppose the boundary for finger 1 is 1 2 3 10 12 19 20 21 and the center is 11; then for finger 2 the boundary is 7 8 9 16 18 25 26 27 and the center is 17.
getFinger(posA, posB, boundary) {
    // return whichever sampled position lies inside this finger's
    // boundary region, or null if neither does
    if (withinBoundary(posA, boundary))
        return posA
    else if (withinBoundary(posB, boundary))
        return posB
    return null
}
detectFingers() {
    boundary1, boundary2  // the two regions defined above
    while (screenIsTouched) {
        posA = readTouch()    // first single-touch sample
        delay()
        posB = readTouch()    // next sample (the other contact, if any)
        finger1 = getFinger(posA, posB, boundary1)
        finger2 = getFinger(posA, posB, boundary2)
    }
}
I'm not an expert and my idea is only an approximation of what I am trying to achieve, but at least, hey, I'm trying to visualize it. So folks, help me out; I know there are a lot of programmers out there with insane skills when it comes to this.
Yes, I get that, but how would you know which one is moving at the time? Correct me if I'm wrong, but when there are already two spots pressed and the coordinates are already swapped, you can't move the first finger; you would have to raise one finger and then press the screen again, because only one touch is recognized. To make it work this way we would need a DT digitizer; this is not possible on a single-touch screen. The 1 ms switching is useless: you can't recognize whether the moving finger is the first or the second one, you would only know that one of them is moving.
Sir, that is why we need to use a relative post and a boundary: every time we swap to extract the coordinate of a finger, using a post (a central position) and a boundary, we have a basis for its movement and for which finger is 1 or 2.
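The post-and-boundary scheme can be sketched concretely. This is a hypothetical illustration (the left/right zone split and all names are mine): each alternating single-touch sample is routed to whichever virtual finger's region it falls in, while the other finger keeps its last known position.

```java
public class ZoneAssign {
    // samples: alternating single-touch readings as {x, y} pairs.
    // Returns {finger1, finger2}: the last sample seen in the left and
    // right half of the screen respectively (null if never touched there).
    static int[][] assign(int[][] samples, int screenW) {
        int[] f1 = null, f2 = null;
        for (int[] p : samples) {
            if (p[0] < screenW / 2) f1 = p; // left zone -> virtual finger 1
            else                    f2 = p; // right zone -> virtual finger 2
        }
        return new int[][] { f1, f2 };
    }

    public static void main(String[] args) {
        // two contacts on a 400-px-wide screen; the left one moves slightly
        int[][] s = { {10, 50}, {300, 60}, {15, 55} };
        int[][] fingers = assign(s, 400);
        System.out.println(fingers[0][0] + "," + fingers[0][1]); // finger 1
        System.out.println(fingers[1][0] + "," + fingers[1][1]); // finger 2
    }
}
```

This also makes the limitation raised above explicit: the moment a contact crosses the zone border, it is attributed to the wrong finger, so the illusion only holds while each finger stays in its region.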
Please check this one:
http://forum.xda-developers.com/showthread.php?t=1158173
Primark said:
Yes, I get that, but how would you know which one is moving at the time? Correct me if I'm wrong, but when there are already two spots pressed and the coordinates are already swapped, you can't move the first finger; you would have to raise one finger and then press the screen again, because only one touch is recognized. To make it work this way we would need a DT digitizer; this is not possible on a single-touch screen. The 1 ms switching is useless: you can't recognize whether the moving finger is the first or the second one, you would only know that one of them is moving.
Sir, that is why we need to use a relative post and a boundary: every time we swap to extract the coordinate of a finger, using a post (a central position) and a boundary, we have a basis for its movement and for which finger is 1 or 2.
Your idea is actually great, but idk if it's possible to make; maybe that's how the Nokia N8 digitizer works to simulate multitouch! For now the x8gesture game mode is not bad, but it's not accurate around the edges of the screen. If some dev could look into your idea or improve x8gesture, it would make lots of people's day and make this phone a lot better... Thanks for reading that long.
Sent from my X8 using XDA Premium App
Sir skyboyextreme, perhaps some of your friends can relay the idea to the experts. I think some of our developers can at least test whether my thought is possible; nothing to lose here, right? Besides, it's for the improvement of our beloved X8.
Sir skyboyextreme, can you please elaborate on the N8 digitizer? I would like to know some details. You said it simulates multitouch; does that mean it only has a single-touch digitizer just like our X8, or does it have dual-touch capabilities that simulate multitouch functions?
Primark said:
Sir skyboyextreme, can you please elaborate on the N8 digitizer? I would like to know some details. You said it simulates multitouch; does that mean it only has a single-touch digitizer just like our X8, or does it have dual-touch capabilities that simulate multitouch functions?
From what I read, the N8 has the same digitizer as the X8 (Synaptics), so it has a single-touch digitizer but emulates dual touch... Well, again, that's what I read.
Yeah, it's a very good idea, Primark. I read a thread where a guy said that the devs are already working on it. Is it true??
Sent from my X8 using XDA App
Sir Kimpoy1994, if what you say about the N8 is true, sir, then perhaps someone could acquire the code for the N8's touch panel; someone could probably recode it to work on the X8. That is probably our best shot at dual-touch emulation.
Primark said:
Sir skyboyextreme, can you please elaborate on the N8 digitizer? I would like to know some details. You said it simulates multitouch; does that mean it only has a single-touch digitizer just like our X8, or does it have dual-touch capabilities that simulate multitouch functions?
The Nokia N8 already has DT, and it's confirmed to have the same digitizer (t1021a)... So if they could simulate DT events using a single-touch digitizer similar to ours, then I guess we can achieve the same as they did. But it's gonna be hard, I guess, and since there is nobody interested in looking into this, I guess we are outta luck, at least for now.
What?? I thought the devs were interested in improving this fantastic smartphone. I mean, they develop the Android system, but I think X8 Synaptics owners (50% of X8 owners, I think ^^) would be so happy. If only they developed that DT, I think they would have completed the biggest challenge of the X8.
.....sorry for bad English, from France
Sent from my X8 using XDA App
Sir skyboyextreme, then I guess our only option is to make some noise and hope somebody hears us; the key to our DT is the N8. Sir doixanh, I do hope you're reading this; since you were the one who started the gesture, I believe you have the best shot at creating it for us.
How can we make some noise ?? I think we just need doixanh or someone who knows him...
Sent from my X8 using XDA App

Screen colors different between A & B

Hello, got an Axon M, but the first screen (A) has a bit different color and tone from screen B.
Is there a fix?
Thanks
ARZLEB said:
Hello, got an Axon M, but the first screen (A) has a bit different color and tone from screen B.
Is there a fix?
Thanks
I've noticed this as well and the screens are a bit offset too. I've read that these issues were a deal-breaker for some and the devices were returned. If I find anything I'll post here.
GJSmith3rd said:
I've noticed this as well and the screens are a bit offset too. I've read that these issues were a deal-breaker for some and the devices were returned. If I find anything I'll post here.
Very true, the screens are offset too; thank you for pointing that out.
I think their quality control only tested the screens at certain brightness levels, because above half brightness the screens have the same tone and brightness, but below 50% one is clearly dimmer. And night mode with dual-screen mode is pretty bad, because the screens don't display the same color temperature at low brightness. The fact that they didn't try to work around the slight screen offset kind of betrays how little effort they spent on this phone.
The more I use it, the more I find absolutely ridiculous beta-level workarounds. I'm getting flashbacks to AOKP: badly hidden settings in nested menus and features that only work half the time... except this is an $800 phone that was sold on contract. A shame, really.
