ECE 477 Digital Systems Senior Design Project Rev 9/12

Homework 10: Patent Liability Analysis

Team Code Name: ______Wall-E Prototype I______   Group No. ___01___

Team Member Completing This Homework: ______Ranmin Chen______

E-mail Address of Team Member: ____chen518___ @ purdue.edu

Evaluation:

SEC     DESCRIPTION                               MAX     SCORE
1.0     Introduction                                5
2.0     Results of Patent and Product Search       40
3.0     Analysis of Patent Liability               30
4.0     Action Recommended                         10
5.0     Summary                                     5
6.0     List of References                         10
        TOTAL                                      100

Comments:

Comments from the grader will be inserted here.

1.0  Introduction

The design project Wall-E Prototype I is an intelligent, automated trash-collecting robot with obstacle-detection capability. The robot responds either to the user's direct manual control or to its object-tracking software in order to locate and collect the targeted trash object.

This report summarizes the patent liability issues of the Wall-E Prototype I project. This introduction presents an overview of the project with a focus on patent liability considerations. Following the introduction, section 2.0 lists the specific patents that need to be discussed, giving the name, filing date, abstract, and key claims of each patent. Section 3.0 analyzes the project's potential liability: the similarities and differences between each patent and our project are discussed, and potential liability for both literal infringement and infringement under the doctrine of equivalents is analyzed. Section 4.0 discusses the actions we should take where the potential for infringement exists. After a brief summary of the report in section 5.0, references are listed in section 6.0.

2.0  Results of Patent and Product Search

1.  US7720257 Object Tracking System [1]

Filing date: 06/16/2005

Abstract:

A system for tracking objects across an area having a network of cameras with overlapping and non-overlapping fields of view. The system may use a combination of color, shape, texture and/or multi-resolution histograms for object representation or target modeling for the tracking of an object from one camera to another. The system may include user and output interfacing.

Key Claim:

A method for tracking an object using sequential Monte Carlo (SMC) methods, the method comprising: initializing tracking; selecting a camera having a field of view; selecting an object to be tracked from the camera field of view; obtaining a target model of the tracked object; generating a first particle of the target model; predicting a next time's particle by a dynamic model; and calculating features of the first particle.
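
For context, the claimed sequential Monte Carlo method is essentially a particle-filter tracking loop. The sketch below is purely illustrative of that general technique; it is not taken from the patent or from our design, and the dynamic model (random-walk motion) and the generic likelihood function are assumptions made only for the example.

    import numpy as np

    def particle_filter_step(particles, weights, likelihood, motion_noise=5.0):
        """One illustrative sequential Monte Carlo (particle filter) update.

        particles:  (N, 2) array of candidate target positions (x, y)
        weights:    (N,) array of normalized particle weights
        likelihood: function scoring how well a position matches the target model
        """
        n = len(particles)
        # Resample particles in proportion to their current weights
        idx = np.random.choice(n, size=n, p=weights)
        particles = particles[idx]
        # Predict the next time's particles with a simple random-walk dynamic model
        particles = particles + np.random.normal(0.0, motion_noise, particles.shape)
        # Re-weight each predicted particle against the target model
        weights = np.array([likelihood(p) for p in particles])
        weights = weights / weights.sum()
        # The weighted mean is one common estimate of the tracked object's location
        estimate = (particles * weights[:, None]).sum(axis=0)
        return particles, weights, estimate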

2.  US7684592 Real-time object tracking system [2]

Filing date: 01/14/2008

Abstract:

A real-time computer vision system tracks one or more objects moving in a scene using a target location technique, which does not involve searching. The imaging hardware includes a color camera, frame grabber and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.

Key Claim:

A method of tracking a target, comprising the steps of: inputting a sequence of images representative of a scene; selecting a target in the scene; computing with a processor the median color of a region around the selected target, and storing this as the color of the target; computing a two-dimensional template of the target based upon the color of the target, and storing this template as the shape of the target; computing the center of the target based upon its shape; a) comparing the center of the target to the center of the target in the previous image of the sequence to determine the motion of the target, if any; b) determining the probable new location of the target based upon the target color, shape and motion; c) using a weighting technique to determine the most probable new location of the target; and d) tracking the target by repeating steps a)-c) for each new image in the sequence of images.
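
To make the structure of this claim easier to follow, the sketch below shows a generic color/shape/motion tracking step of the same flavor. It is a simplified assumption of how such a weighted combination of cues could be coded, not the patented implementation and not our project's code; the cue weights and decay constants are placeholders.

    import numpy as np

    def track_step(frame, target_color, template_shape, prev_center,
                   w_color=0.5, w_shape=0.3, w_motion=0.2):
        """Illustrative color/shape/motion tracking step (simplified).

        frame:          H x W x 3 image as a numpy array
        target_color:   stored median color of the target (3-vector)
        template_shape: (height, width) of the stored 2-D target template
        prev_center:    (row, col) of the target center in the previous image
        """
        # Color cue: how close each pixel is to the stored target color
        color_map = np.exp(-np.linalg.norm(frame.astype(float) - target_color,
                                           axis=2) / 50.0)

        # Shape cue: a window of the template's size around the previous center
        h, w = template_shape
        r0, c0 = max(prev_center[0] - h // 2, 0), max(prev_center[1] - w // 2, 0)
        shape_map = np.zeros_like(color_map)
        shape_map[r0:r0 + h, c0:c0 + w] = 1.0

        # Motion cue: prefer locations near the previous center
        rows, cols = np.indices(color_map.shape)
        motion_map = np.exp(-np.hypot(rows - prev_center[0],
                                      cols - prev_center[1]) / 100.0)

        # Weighted combination of the three cues; the peak gives the most
        # probable new location of the target
        score = w_color * color_map + w_shape * shape_map + w_motion * motion_map
        return np.unravel_index(np.argmax(score), score.shape)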

3.  US8055020 Method of object tracking [3]

Filing date: 08/29/2008

Abstract:

The present invention relates to a method for the recognition and tracking of a moving object, in particular of a pedestrian, from a motor vehicle, at which a camera device is arranged. An image of the environment including picture elements is taken in the range of view of the camera device (20) by means of the camera device at regular time intervals and those picture elements are identified with the help of an image processing system which correspond to moving objects to be tracked. A picture element is extracted for each of these objects which represents a projection in image coordinates of that spatial point at which the object contacts a road plane. The movement of the corresponding spatial point in the road plane is tracked by means of a state estimator which uses an at least four-dimensional state vector whose components are a position of the spatial point in the road plane and an associated speed in the road plane, wherein the tracking of the movement by the state estimator includes the steps that a prediction is generated for the state vector, this prediction is converted into image coordinates via suitable projection equations, an error to be expected for this prediction is calculated in image coordinates by means of a covariance matrix, and this prediction is compared with the picture element extracted in a later image and is updated.

Key Claim:

A method for recognizing and tracking of a moving object from a motor vehicle having a camera device arranged thereon, the method comprising: taking, using the camera device, images of an environment within a range of view of the camera device at time intervals, said images including picture elements; identifying for each image, with the aid of an image processing system, the picture elements in the image that correspond to a moving object to be tracked, and extracting a picture element that represents a projection in image coordinates of a spatial point where the object contacts a road plane; and tracking movement of the spatial point in the road plane using a state estimator that includes an at least four-dimensional state vector comprising a position of the spatial point in the road plane and an associated speed in the road plane; said tracking comprising the steps of: generating a prediction for the state vector; converting the prediction into image coordinates by suitable projection equations; calculating an error for the prediction in image coordinates by using a covariance matrix; and comparing the prediction with the picture element extracted in a later image, and updating the prediction based upon the comparison.
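
For context, the claimed state estimator with a four-dimensional state vector (position and speed in the road plane), a prediction, and a covariance-based error estimate resembles a standard constant-velocity Kalman filter. The generic sketch below illustrates that kind of estimator only; it omits the projection into image coordinates, and the noise parameters are assumptions made for the example, not values from the patent or from our design.

    import numpy as np

    def kalman_cv_step(x, P, z, dt=0.1, q=1.0, r=4.0):
        """One predict/update step of a constant-velocity Kalman filter.

        x: state vector [px, py, vx, vy] (position and speed in the plane)
        P: 4x4 state covariance
        z: measured position, assumed here to be already converted from
           image coordinates into the road plane
        """
        F = np.array([[1, 0, dt, 0],      # constant-velocity dynamic model
                      [0, 1, 0, dt],
                      [0, 0, 1,  0],
                      [0, 0, 0,  1]], dtype=float)
        H = np.array([[1, 0, 0, 0],       # we observe position only
                      [0, 1, 0, 0]], dtype=float)
        Q, R = q * np.eye(4), r * np.eye(2)   # assumed process/measurement noise

        # Generate a prediction and the error to be expected for it
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q

        # Compare the prediction with the newly extracted measurement and update
        innovation = z - H @ x_pred
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ innovation
        P_new = (np.eye(4) - K @ H) @ P_pred
        return x_new, P_new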

3.0  Analysis of Patent Liability

1.  US7720257 Object Tracking System [1]

The patent abstract states that the system "may use a combination of color, shape, texture and/or multi-resolution histograms for object representation or target modeling for the tracking of an object from one camera to another. The system may include user and output interfacing." The Wall-E Prototype I in fact uses a combination of color, shape, and multi-resolution histograms for object representation and object tracking, which fits the language of the patent's abstract. Although Wall-E Prototype I does not have a multi-camera field of view, it can be considered to perform substantially the same function in substantially the same way. Wall-E Prototype I also provides a "user and output interfacing" option for manual control and testing.
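
As an illustration of what a multi-resolution color-histogram representation looks like in practice, a minimal OpenCV sketch is given below. It is only an assumed example of this class of target model; the pyramid depth, bin count, and use of the hue channel are placeholders, not the actual Wall-E Prototype I tracking code.

    import cv2
    import numpy as np

    def multires_color_histograms(target_bgr, levels=3, bins=16):
        """Assumed multi-resolution color-histogram target model.

        Builds a normalized hue histogram of the target patch at several
        resolutions so the model captures coarse and fine color structure.
        """
        hists = []
        patch = target_bgr
        for _ in range(levels):
            hsv = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)
            hist = cv2.calcHist([hsv], [0], None, [bins], [0, 180])
            cv2.normalize(hist, hist, 1.0, 0.0, cv2.NORM_L1)
            hists.append(hist.flatten())
            patch = cv2.pyrDown(patch)   # halve the resolution for the next level
        return np.concatenate(hists)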

2.  US7684592 Real-time object tracking system [2]

Three aspects of this patent may be considered infringed by our Wall-E Prototype I project:

First, in the abstract: “The imaging hardware includes a color camera, frame grabber and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image.”

This is almost an exact description of the hardware and software architecture of the Wall-E Prototype I project.

Second, "Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen." This applies because the Wall-E Prototype I provides a manual control option for the user, which requires the display of a live image from the color camera on the computer screen.

Third, "The system will then keep track of the moving target in the scene in real-time." Because Wall-E Prototype I is an object-tracking robot, it must, by definition, "keep track of the moving target in the scene in real-time."

3.  US8055020 Method of object tracking [3]

This patent claims "recognizing and tracking of a moving object from a motor vehicle having a camera device arranged thereon," which is similar to our project's packaging, in which the camera is mounted on a moving vehicle.

4.0  Action Recommended

If Wall-E Prototype I is made into a commercial product, the manual control option will be removed and reserved solely for manufacturer testing purposes. Because that function would be hidden from the end user, it could no longer be considered a "function" of the commercial product, which eliminates the potential infringement of the first patent (US7720257) and the second patent (US7684592) with respect to "user and output interfacing" and "the display of live image from the color camera." By using a combination of a wireless camera (either a commercial unit, or an ordinary webcam with a Raspberry Pi as a Wi-Fi transmission module) and an Atom board, the project separates the "recognizing" functionality from the camera module on the "motor vehicle." Because the Atom board is placed off-board as a remote server, the robot is no longer a motor vehicle that performs "recognizing and tracking of a moving object"; it is a motor vehicle that sends the video stream back to a server for recognition and tracking. This eliminates the potential infringement of the third patent (US8055020).
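
A minimal sketch of this proposed split is shown below, assuming the Raspberry Pi streams JPEG-compressed frames to the off-board Atom server over a plain TCP socket. The host name, port, and length-prefixed framing are placeholders chosen for the example, not the project's actual protocol.

    import socket
    import struct
    import cv2

    SERVER = ("atom-server.local", 5000)   # placeholder host name and port

    def stream_camera_to_server():
        """Capture frames on the vehicle and send them to the remote server.

        All recognition and tracking happens off-board, so the vehicle only
        captures and transmits video.
        """
        cap = cv2.VideoCapture(0)
        sock = socket.create_connection(SERVER)
        try:
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                ok, jpeg = cv2.imencode(".jpg", frame)
                if not ok:
                    continue
                data = jpeg.tobytes()
                # Length-prefixed framing lets the server split the byte stream
                sock.sendall(struct.pack(">I", len(data)) + data)
        finally:
            cap.release()
            sock.close()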

For the remaining possible infringements: 1. "may use a combination of color, shape, texture and/or multi-resolution histograms for object representation or target modeling for the tracking of an object from one camera to another." It can be argued that, since the Wall-E Prototype I is a single-camera robot, it performs the object-tracking function in a substantially different way. If the jury decides otherwise, a license has to be purchased from the patent holder. 2. "The system will then keep track of the moving target in the scene in real-time." It can be argued that the crucial parts of the project, the Raspberry Pi and the Atom board, both run on general-purpose operating systems, which do not strictly perform tasks in "real time." Moreover, the main program on the robot uses a polling loop that first samples the sensor readings and then acts on the data collected, which is also not a true "real-time" mechanism. Again, if the jury decides otherwise, a license needs to be purchased from the patent holder.
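
The poll-then-act structure referred to in point 2 might look roughly like the following sketch. The sensor, decision, and motor helpers are hypothetical placeholders, and the polling period is an assumed value, not a measured one.

    import time

    POLL_PERIOD_S = 0.05   # assumed polling period

    def main_loop(read_sensors, decide_action, drive_motors):
        """Illustrative poll-then-act control loop (not hard real-time).

        read_sensors, decide_action, and drive_motors are hypothetical
        callables standing in for the robot's actual sensor and motor code.
        """
        while True:
            readings = read_sensors()          # 1. sample all sensor readings
            command = decide_action(readings)  # 2. decide based on the data
            drive_motors(command)              # 3. act on the decision
            time.sleep(POLL_PERIOD_S)          # wait for the next polling cycle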

5.0  Summary

This report discusses potential patent infringement issues by comparing the Wall-E Prototype I to existing patents whose descriptions are similar to the project. Potential infringements are identified against specific key claims in those patents, and based on these claims a series of future actions is explored to eliminate the potential infringements.


6.0 List of References

[1]  V. Morellas, M. E. Bazakos, Y. Ma, and A. Johnson, "Object tracking system," U.S. Patent 7,720,257, filed June 16, 2005. [Online]. Available: http://www.freepatentsonline.com/7720257.pdf

[2]  G. V. Paul, G. J. Beach, C. J. Cohen, and C. J. Jacobus, "Real-time object tracking system," U.S. Patent 7,684,592, filed January 14, 2008. [Online]. Available: http://www.freepatentsonline.com/7684592.pdf

[3]  M. Meuter and U. Iurgel, "Method of object tracking," U.S. Patent 8,055,020, filed August 29, 2008. [Online]. Available: http://www.freepatentsonline.com/8055020.pdf
