Design & Development of an adaptive search engine with near-real-time performance for use in assistive applications

posted Dec 21, 2010, 11:32 PM by Unknown user   [ updated Oct 13, 2011, 12:15 AM ]

This project aims to help the elderly in old-age homes who have little family support, enabling them to connect to the world through simple voice-driven commands.

To make the system adapt to each user, we devised a mechanism that frames every interaction as an action-reaction pair. Every sentence the user says (the input) is recorded, and over time an optimal match for that sentence is returned (the output).

Within the program, every input sentence is called an "Action" sentence. The Action sentence triggers the search engine, which looks for the output (referred to as the "Reaction" sentence).

The most important challenge of the project was to selectively pick two sentences from a large set of sentences collected over time. The map created between the two sentences was then verified over several iterations before a one-to-one map could be established.

This process was carried out for every set of sentences spoken by the elderly user. After several attempts, the algorithm was found to generate a one-to-one map for all the input sentences.
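The action-reaction mechanism described above can be sketched in code. This is an illustrative sketch only, not the project's actual implementation: it assumes sentences are compared by simple word overlap, and that the most frequently observed reaction for an action is promoted into the one-to-one map.

```python
# Illustrative sketch of action -> reaction matching (hypothetical names;
# the project's real similarity measure and learning rules are not shown).
from collections import Counter


def similarity(a, b):
    """Word-overlap similarity between two sentences, in [0, 1]."""
    wa, wb = Counter(a.lower().split()), Counter(b.lower().split())
    common = sum((wa & wb).values())
    total = max(len(a.split()), len(b.split()))
    return common / total if total else 0.0


class ActionReactionMap:
    def __init__(self, threshold=0.6):
        self.pairs = {}    # learned one-to-one map: action -> reaction
        self.log = []      # (action, reaction) observations over time
        self.threshold = threshold

    def observe(self, action, reaction):
        """Record one action/reaction observation from the user log."""
        self.log.append((action, reaction))
        # Promote the most frequent reaction seen for this action.
        reactions = [r for a, r in self.log if a == action]
        self.pairs[action] = Counter(reactions).most_common(1)[0][0]

    def react(self, sentence):
        """Return the reaction of the best-matching stored action, if any."""
        best_action, best_score = None, 0.0
        for action in self.pairs:
            score = similarity(sentence, action)
            if score > best_score:
                best_action, best_score = action, score
        return self.pairs[best_action] if best_score >= self.threshold else None
```

For example, after `m.observe("turn on the lights", "lights switched on")`, a later input such as `m.react("please turn on the lights")` matches the stored Action sentence by word overlap and returns `"lights switched on"`.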

The idea behind the adaptive algorithm is to develop an application that can cater to elderly people and their ever-extending needs. The algorithm contains modules that learn from human behavior and actions, which are logged and later analyzed.

Run-Time Complexity:

The algorithm has been tested on large sets of randomly generated input data, and its complexity was found to be O(n). Techniques from Artificial Intelligence, Fuzzy Logic, and Neural Networks were employed in the project.


To estimate the time complexity, a log file with a fairly large input (10⁶ lines) was created, and the time taken by the algorithm to analyze the file was recorded. The O(n) complexity was obtained from the graph that was plotted (shown below).

The figure above plots the number of Action-Reaction pair matches vs. input file size.

The flow chart below explains the entire working of the project.

This approach will also be presented as a short paper in the proceedings of HCII-2011. (To appear)

The entire project is part of the larger Touch-Lives Initiatives.

Future Implementation: 
Presently, it is a mobile-based application. Future work aims to make the device invisible and to deliver suggestions as spoken commands rather than the present text-display output, and to make the system available to the elderly at very low cost.


1) An Optimal Human Adaptive Algorithm to Find Action-Reaction Word Pairs. (Short paper, in press.) Accepted for publication in the proceedings of the 14th International Conference on Human-Computer Interaction.