We are interested in developing approaches for monitoring news media effectively and efficiently. This involves a variety of research topics spanning multiple computer science and data science disciplines.
Topic Classification
The automatic labelling of instances into a predefined set of areas of interest is a research field known as topic classification, or topic categorisation. More specifically, we classify articles using their internal content (among other features) into topics that are useful for our client base (e.g., Mergers and Acquisitions or Wearable Technologies). Each article can be assigned to one or more categories, a setting known as multi-label classification. Signal AI uses a large pool of representation functions which alter the way articles are “seen” by the model. For instance, some topics are best predicted from the full textual content of an article, while others are better captured by focusing only on the people and companies being mentioned.
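A minimal sketch of the multi-label setup, using scikit-learn's one-vs-rest strategy over a TF-IDF text representation; the topics, articles, and model choice here are illustrative assumptions, not Signal AI's actual taxonomy or pipeline.

```python
# Minimal multi-label topic classification sketch (illustrative only).
# The topics and articles are made-up examples, not Signal AI's taxonomy.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MultiLabelBinarizer

articles = [
    "Company A agrees to acquire Company B for $2bn",
    "New smartwatch tracks heart rate and sleep quality",
    "Startup making fitness trackers bought by electronics giant",
]
labels = [
    {"Mergers and Acquisitions"},
    {"Wearable Technologies"},
    {"Mergers and Acquisitions", "Wearable Technologies"},
]

mlb = MultiLabelBinarizer()
y = mlb.fit_transform(labels)  # one binary column per topic

# One binary classifier per topic over a TF-IDF representation of the text;
# other representations (e.g. only the entities mentioned) could be swapped in.
model = make_pipeline(
    TfidfVectorizer(),
    OneVsRestClassifier(LogisticRegression(max_iter=1000)),
)
model.fit(articles, y)

pred = model.predict(["Chipmaker to buy wearable sensor firm"])
print(mlb.inverse_transform(pred))
```

Because each topic gets its own binary classifier, an article can naturally receive zero, one, or several labels at prediction time.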
Entity Recognition and Disambiguation
We recognise entities (such as people, places, organisations and companies) in news text and link them to entries in various knowledge bases. To achieve this, we combine named entity recognition methods with entity disambiguation methods and our own internal tools. When evaluating such systems, it is common practice to measure the average performance over a wide range of entities and articles. However, to our clients some entities and mentions are much more important than others, so we have developed tools and evaluation methods that account for this unevenness.
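A minimal sketch of the two-step idea, using spaCy's pretrained en_core_web_sm model for recognition and a toy dictionary lookup as a stand-in for disambiguation against a knowledge base; the lookup table and IDs are hypothetical, not Signal AI's internal tooling.

```python
# Sketch of entity recognition followed by a toy disambiguation step.
# Assumes spaCy and the en_core_web_sm model are installed; the KB lookup
# is a hypothetical dictionary, not a real knowledge base or internal tool.
import spacy

nlp = spacy.load("en_core_web_sm")

# Hypothetical knowledge base: surface form -> canonical entity ID.
KB = {
    "apple": "Q312",      # Apple Inc. (Wikidata-style ID, illustrative)
    "tim cook": "Q312758",
}

def recognise_and_link(text):
    doc = nlp(text)
    linked = []
    for ent in doc.ents:
        if ent.label_ in {"ORG", "PERSON", "GPE"}:
            # Naive disambiguation: exact lowercase lookup in the toy KB;
            # unmatched mentions are left unlinked (None).
            entity_id = KB.get(ent.text.lower())
            linked.append((ent.text, ent.label_, entity_id))
    return linked

print(recognise_and_link("Tim Cook said Apple will open a new office in London."))
```

In practice the disambiguation step must resolve ambiguous surface forms ("Apple" the company vs. the fruit) using context, which is where the interesting modelling work lies.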
Sentiment Analysis
Public opinion is important for organisations, especially when attitudes toward products, spokespeople and events have a direct influence on business decisions. An important field of research at Signal AI is sentiment analysis, where we develop high-performance models that analyse the world's media content for positive and negative sentiment with respect to billions of entity mentions. By providing finer-grained, entity-level sentiment analysis, our users are better able to capture key insights into the sentiment surrounding their most important entities.
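A simple way to illustrate entity-level (as opposed to document-level) sentiment is to score the local context around each mention separately. The sketch below uses a generic pretrained model via the Hugging Face transformers pipeline and a naive sentence split; the entities, article text, and model are illustrative assumptions, not Signal AI's production system.

```python
# Entity-level sentiment sketch: score the sentence around each mention
# rather than the whole article. Uses a generic pretrained model from the
# `transformers` pipeline; article, mentions, and model are illustrative.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")

article = (
    "Acme Corp posted record profits this quarter. "
    "Meanwhile, Globex faces a regulatory investigation."
)
mentions = ["Acme Corp", "Globex"]  # assumed output of an entity recogniser

for entity in mentions:
    # Take the sentence containing the mention as local context (naive split).
    sentence = next(s for s in article.split(". ") if entity in s)
    result = classifier(sentence)[0]
    print(entity, result["label"], round(result["score"], 3))
```

Scoring each mention in its own context lets the same article carry positive sentiment for one entity and negative sentiment for another.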
Anomaly Detection
Anomaly detection involves modelling historical patterns of media coverage for topics and entities of interest, and then using these models as background knowledge to identify unusual trends and events that deviate from the norm. We apply anomaly detection both retrospectively and in a streaming mode. The former lets us summarise the important highlights (anomalies) identified during a given period of time. In streaming mode, alerts are generated in real time to notify users of newly detected trends or events (anomalies).
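A minimal sketch of the underlying idea: compare each day's mention volume against a rolling historical baseline and flag large deviations. The counts below are invented and the rolling z-score is a simple stand-in, not Signal AI's actual models.

```python
# Anomaly-detection sketch: flag days whose mention volume deviates strongly
# from a rolling historical baseline (z-score). Data and threshold are
# illustrative only.
import pandas as pd

# Daily mention counts for one topic or entity (invented data).
counts = pd.Series(
    [102, 98, 110, 95, 105, 99, 101, 97, 350, 104],
    index=pd.date_range("2023-01-01", periods=10, freq="D"),
)

window = 7
# Baseline statistics computed only from *previous* days (shift by one).
baseline_mean = counts.rolling(window, min_periods=3).mean().shift(1)
baseline_std = counts.rolling(window, min_periods=3).std().shift(1)

z_scores = (counts - baseline_mean) / baseline_std
anomalies = counts[z_scores > 3]  # unusually high coverage
print(anomalies)
```

Run retrospectively, this yields a summary of anomalous days over a period; in a streaming setting the same baseline statistics can be updated incrementally as each new count arrives, triggering an alert as soon as the threshold is crossed.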
Data Visualisation
Our users risk being overwhelmed by the volume of data and missing important information. We use a variety of visualisation methods to help them find critical information quickly.