Confused about touchesBegan and touchesEnded



I've looked at Apple's sample touch application, and it doesn't solve my problem. In the sample app, when a touch event occurs, it tries to keep a view pinned to the location of the touch. That keeps the logic simple: it just looks for the view whose frame contains the touch location.

That doesn't work in my scenario. Here is my scenario.

One view contains a bunch of subviews. The idea is to let the user throw one of the subviews in the direction of their gesture. I want the touchesBegan event to find the view whose center is closest to the touch.

Then I want the touchesEnded event to move that same view at a velocity determined by the begin and end events. The velocity is not necessarily the same as the finger's, so I can't simply "attach" the view to the touch location the way Apple does in the sample app.

I thought about tagging the view identified in touchesBegan with the touch object and using that to compare against the touch object in the touchesEnded event, but that didn't seem to work: the touch objects appeared to be different in touchesBegan and touchesEnded.

So what am I missing? How do I preserve the relationship between the touch and the view I want to move?

The touch objects won't be alike; the object changes with the touched view, location, and so on. See the UITouch class reference for details.

Suggestion 1:

I implemented the same thing in my project by subclassing UIView and handling the touch methods in the subclass. I create instances of this subclass instead of plain UIViews.

@interface myView : UIView
@end

@implementation myView
// Override the touch-handling methods
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
}
@end

Suggestion 2:

Use a UIPanGestureRecognizer to do the same thing. This is the simpler approach. (A pan recognizer also offers velocityInView:, which gives you the fling velocity directly.)

UIPanGestureRecognizer *panGesture = [[[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panGestureMoveAround:)] autorelease];
[panGesture setMaximumNumberOfTouches:2];
[panGesture setDelegate:self];
[yourSubView addGestureRecognizer:panGesture];

- (void)panGestureMoveAround:(UIPanGestureRecognizer *)gesture
{
    UIView *piece = [gesture view];
    [self adjustAnchorPointForGestureRecognizer:gesture];
    if ([gesture state] == UIGestureRecognizerStateBegan || [gesture state] == UIGestureRecognizerStateChanged) {
        CGPoint translation = [gesture translationInView:[piece superview]];
        // note the 0.1 damping on the y axis
        [piece setCenter:CGPointMake([piece center].x + translation.x, [piece center].y + translation.y * 0.1)];
        [gesture setTranslation:CGPointZero inView:[piece superview]];
    }
}

Here is an example.

Here is my solution. In this solution, the arrow views move around inside a view called scaledView. The arrows belong to the spriteView class, a subclass of UIView.

-(spriteView *)findArrowContainingTouch:(UITouch *)touch inView:(UIView *)scaledView atPoint:(CGPoint)touchPoint
{
    //There could be multiple subviews whose rectangles include the point. Find the one whose center is closest.
    spriteView *touchedView = nil;
    float bestDistance2 = 9999999999.9; // effectively "infinity"
    float testDistance2 = 0;
    for (spriteView *arrow in scaledView.subviews) {
        if (arrow.tag == ARROWTAG) {
            if (CGRectContainsPoint(arrow.frame, touchPoint)) {
                testDistance2 = [self distance2Between:touchPoint and:arrow.center];
                if (testDistance2 < bestDistance2) {
                    bestDistance2 = testDistance2;
                    touchedView = arrow;
                }
            }
        }
    }
    return touchedView;
}

The method distance2Between computes the square of the distance between two points.

-(spriteView *)findArrowTouchedAtLocation:(CGPoint)p inView:(UIView *)scaledView
{
    spriteView *arrow = nil;
    for (spriteView *testArrow in scaledView.subviews) {
        if (testArrow.tag == ARROWTAG) {
            if ((p.x == testArrow.touch.x) && (p.y == testArrow.touch.y)) {
                arrow = testArrow;
            }
        }
    }
    return arrow;
}

Some subviews in scaledView are not arrows, so I use the constant ARROWTAG to identify the arrows.

#pragma mark - Touches
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UIView *scaledView = [self getScaledView];
    for (UITouch *touch in touches) {
        CGPoint touchLocation = [touch locationInView:scaledView];
        spriteView *arrow = [self findArrowContainingTouch:touch inView:scaledView atPoint:touchLocation];
        if (arrow != nil) {
            //Record the original location of the touch event in a property, originalTouchLocation, of the arrow instance. Additionally, store the same point in the touch property.
            //Both properties are necessary. originalTouchLocation will be used in touchesEnded and is not available in the touch object, so the information is stored separately.
            //The touch property, a CGPoint, is stored in order to identify the view. It is updated by every touch event; the new value is used by the next event to find the appropriate view.
            arrow.touch = CGPointMake(touchLocation.x, touchLocation.y);
            arrow.originalTouchLocation = CGPointMake(touchLocation.x, touchLocation.y);
            arrow.debugFlag = YES;
            arrow.timeTouchBegan = touch.timestamp;
        }
    }
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UIView *scaledView = [self getScaledView];
    for (UITouch *touch in touches) {
        CGPoint touchLocation = [touch locationInView:scaledView];
        CGPoint previousLocation = [touch previousLocationInView:scaledView];
        //previousLocation is used to find the right view. It must be in the coordinate system of the same view used in touchesBegan.
        spriteView *arrow = [self findArrowTouchedAtLocation:previousLocation inView:scaledView];
        if (arrow != nil) {
            arrow.touch = CGPointMake(touchLocation.x, touchLocation.y);
        }
    }
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UIView *scaledView = [self getScaledView];
    for (UITouch *touch in touches) {
        CGPoint touchLocation = [touch locationInView:scaledView];
        CGPoint previousLocation = [touch previousLocationInView:scaledView];
        spriteView *arrow = [self findArrowTouchedAtLocation:previousLocation inView:scaledView];
        if (arrow != nil) {
            arrow.touch = CGPointMake(touchLocation.x, touchLocation.y);
            float strokeAngle = [self findAngleFrom:arrow.originalTouchLocation to:touchLocation];
            float strokeDistance = sqrt([self distance2Between:arrow.originalTouchLocation and:touchLocation]);
            NSTimeInterval timeElapsed = touch.timestamp - arrow.timeTouchBegan;
            float newArrowSpeed = strokeDistance / timeElapsed / 100; //might want a different conversion factor, but this one works quite well
            arrow.transform = CGAffineTransformMakeRotation(strokeAngle);
            arrow.currentSpeed = newArrowSpeed;
        }
    }
}
-(void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    UIView *scaledView = [self getScaledView];
    for (UITouch *touch in touches) {
        CGPoint previousLocation = [touch previousLocationInView:scaledView];
        spriteView *arrow = [self findArrowTouchedAtLocation:previousLocation inView:scaledView];
        if (arrow != nil) {
            arrow.originalTouchLocation = CGPointMake(99999.0, 99999.0);
            NSLog(@"Arrow original location erased");
        }
    }
}

The touch object actually is the same in touchesBegan and touchesEnded. It turns out that is the way to track a touch.
