CS587 Project contract

iCampusVideo

first/last 999-999-999

first/last 999-999-999

Project feature description:

Client (Phone) side:

  • The user can specify the address of the web-server to send data to in the settings for the App.
  • Once the App is started, the user will press a button on the phone to start the recording.
  • While recording, roughly every 10 seconds, metadata (location, altitude, time, angle, direction, speed) will be read from the sensors and sent along with the video to the server (see the sketch after this list).
  • The user can press a stop button once they wish to end the recording.
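As a rough illustration, the sketch below shows what one metadata record and the 10-second send loop might look like, assuming the phone posts JSON over HTTP. The field names, the /metadata endpoint, and the readSensors callback are assumptions for illustration, not part of the contract.

```typescript
// Sketch of the per-sample metadata record the phone could send every
// ~10 seconds. Field names are illustrative; the contract only lists
// which sensors are read.
interface MetadataSample {
  lat: number;        // location: latitude in degrees
  lng: number;        // location: longitude in degrees
  altitude: number;   // meters above sea level
  time: number;       // Unix timestamp (ms) of the sample
  angle: number;      // camera tilt angle in degrees
  direction: number;  // compass heading in degrees (0 = north)
  speed: number;      // ground speed in m/s
}

// POST one sample to the server address the user configured in the
// App settings. The /metadata endpoint is an assumption.
async function sendSample(serverUrl: string, sample: MetadataSample): Promise<void> {
  await fetch(`${serverUrl}/metadata`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(sample),
  });
}

// While recording, sample roughly every 10 seconds until stopped.
function startMetadataLoop(serverUrl: string, readSensors: () => MetadataSample): () => void {
  const timer = setInterval(() => {
    void sendSample(serverUrl, readSensors());
  }, 10_000);
  return () => clearInterval(timer); // call this when the user presses stop
}
```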

Server side:

A web-server will be running on the server side to process three types of requests.

The main user interface on the server shall be a webpage with a Google Map, together with the control buttons/links the app needs. Video data and metadata are stored on the server, and they are also indexed in the database.
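A minimal sketch of the three request types as web-server routes follows, assuming a Node/Express server (the contract does not name a framework); all route paths are illustrative.

```typescript
// Skeleton of the web-server's three request types, using Express as an
// assumed framework. Each handler body is filled in by later sketches.
import express from "express";

const app = express();
app.use(express.json());

// 1) Record: receive video and metadata from the phone.
app.post("/record", (req, res) => { /* store to file server, index metadata */ res.sendStatus(200); });

// 2) Query: find videos covering a point or box on the map.
app.get("/query", (req, res) => { /* look up spatial index, return matches */ res.json([]); });

// 3) Play Video: serve a stored video for browser/phone playback.
app.get("/video/:id", (req, res) => { /* stream the file from disk */ res.sendStatus(200); });

// The main user interface: a page with a Google Map and control buttons.
app.use(express.static("public"));

app.listen(8080);
```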

1) Record:

  • The phone will initiate a connection with the server and begin streaming video and metadata information to the web-server.
  • The web-server stores the phone’s recorded video and metadata on a file server.
  • The metadata is processed to generate spatial index information.

The GUI for recording is on the mobile phone. The server receives stream data on the backend.
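A sketch of what the Record handler could look like on the backend, assuming the phone uploads each video segment together with its metadata as a multipart form; multer and the field names ("video", "metadata") are assumptions for illustration.

```typescript
// Sketch of the Record handler. The uploaded video file is stored on
// disk (standing in for the file server), and the metadata samples are
// pushed into an in-memory list standing in for the spatial index.
import express from "express";
import multer from "multer";

const upload = multer({ dest: "videos/" }); // file-server directory
const app = express();

// In-memory stand-in for the spatial index built from the metadata.
const index: { videoId: string; lat: number; lng: number; time: number }[] = [];

app.post("/record", upload.single("video"), (req, res) => {
  const videoId = req.file!.filename;            // stored video on disk
  const samples = JSON.parse(req.body.metadata); // per-sample sensor data
  for (const s of samples) {
    index.push({ videoId, lat: s.lat, lng: s.lng, time: s.time });
  }
  res.json({ videoId });
});
```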

2) Query:

  • When a user navigates to the web-server address in a computer web browser, it will display a web page with a Google Map, with the default view centered on the USC campus and possibly the user’s current location marked on the map.
  • The user can click a point or drag a box on the map to make a query.
  • On the mobile app, the user can change from recording mode to querying mode to make a query as well.
  • The query is sent to the web-server, which processes it and finds recorded videos that cover the queried point or area at any time during their recording (a server-side sketch follows this list).
  • Query results are returned to the user and displayed as interactive points on the map where recordings intersect the query area, or as a list on the side.
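One way the server-side query could work is sketched below, with a linear scan standing in for the real spatial index; the bounding-box query parameters are assumptions (a clicked point can be treated as a very small box).

```typescript
// Sketch of the Query handler: given a box on the map, return videos
// whose recorded positions fall inside it at any time.
import express from "express";

const app = express();

// The index populated by the Record handler (see the earlier sketch).
const index: { videoId: string; lat: number; lng: number; time: number }[] = [];

app.get("/query", (req, res) => {
  const south = Number(req.query.south), north = Number(req.query.north);
  const west = Number(req.query.west), east = Number(req.query.east);

  // A video matches if any of its metadata samples lies in the box,
  // i.e. it covered the queried area at some time during recording.
  const hits = index.filter(
    (s) => s.lat >= south && s.lat <= north && s.lng >= west && s.lng <= east
  );

  // Deduplicate by video and return one result per matching recording.
  const videos = [...new Set(hits.map((h) => h.videoId))];
  res.json(videos);
});
```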

3) Play Video:

  • When a user clicks on one of the points or an item in the list, the video will display on one side of the page.
  • During playback, the map remains active, displaying the recorded trajectory and the current viewing area of the displayed video (see the sketch after this list).
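A browser-side sketch of the playback/map synchronization, assuming the page has the video's metadata samples and the Google Maps JavaScript API is loaded; all function and field names are illustrative.

```typescript
// Sketch of keeping the map in sync during playback: draw the recorded
// trajectory, then move a marker to the recorder's position at the
// current playback time.
function playWithTrajectory(
  video: HTMLVideoElement,
  map: google.maps.Map,
  samples: { lat: number; lng: number; time: number }[] // time = seconds into video
): void {
  // Draw the full recorded trajectory once.
  const path = samples.map((s) => ({ lat: s.lat, lng: s.lng }));
  new google.maps.Polyline({ path, map });

  // A marker tracks the recorder's position as the video plays.
  const marker = new google.maps.Marker({ position: path[0], map });
  video.addEventListener("timeupdate", () => {
    // Find the last sample at or before the current playback position.
    const t = video.currentTime;
    const current = samples.filter((s) => s.time <= t).pop() ?? samples[0];
    marker.setPosition({ lat: current.lat, lng: current.lng });
  });
}
```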

Schedule:

Completed by midterm:

  • Mobile interface for recording.
  • Sending video data and some metadata (location, time, altitude) to the back-end.
  • Back-end for storing video data and metadata.
  • Ability to play a video stored on the server from a browser.

Completed by final deadline:

  • Obtaining and storing the rest of the metadata.
  • Database for indexing metadata.
  • Mobile and web-server interface for querying, with map display of results.
  • Ability to play the video in the browser with the map showing the position of the recorder.
  • Ability to play the video on the mobile app.

GUI:

Web Browser:

The user clicks on the Query button and can either click a point or drag a box on the map (the box interaction is sketched below).

The query brings back a list of results, displayed on the map and in the results box.

When the user clicks on one of the results, the video starts playing, with a map display on the left showing the recorder’s position as the video plays.
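The box-drawing interaction might be wired up with the Maps JavaScript API drawing library, as sketched below; the /query endpoint and its parameters match the earlier (assumed) server sketch.

```typescript
// Sketch of the browser query GUI: the user drags a rectangle on the
// map, and its bounds are sent to the assumed /query endpoint.
function enableBoxQuery(map: google.maps.Map): void {
  const manager = new google.maps.drawing.DrawingManager({
    drawingMode: google.maps.drawing.OverlayType.RECTANGLE,
    drawingControl: false,
  });
  manager.setMap(map);

  google.maps.event.addListener(manager, "rectanglecomplete", (rect: google.maps.Rectangle) => {
    const b = rect.getBounds()!;
    const sw = b.getSouthWest(), ne = b.getNorthEast();
    void fetch(
      `/query?south=${sw.lat()}&west=${sw.lng()}&north=${ne.lat()}&east=${ne.lng()}`
    ).then((r) => r.json()); // then display results on the map and in the results box
  });
}
```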

Mobile App:


(a) User starts recording.
(b) User clicks on Stop Recording.

(c) User can now send or delete the recording.
(d) User clicks on Query to display the map.

(e) The map allows the user to place a marker; clicking Search displays matching videos as clickable icons on the same screen. Clicking a video icon displays its meta-information in the space below the Search button.