Social media pervades everyday life, and users' emotional states are reflected in what they post. Previous studies have reported social organizations and psychologists using social media to identify depressed individuals. However, because users publish diverse kinds of content, it is difficult for an automated system to jointly consider the text, the images, and the information hidden behind the images. To address this problem, we propose BlueMemo, a system that screens social media for users at risk of depression. We collect real-time posts from Twitter; from each post, learned text features, image features, and visual attributes are extracted as three modalities and fed into a multi-modal fusion and classification model. BlueMemo can help physicians and clinicians quickly and accurately identify users at potential risk of depression.
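As a rough illustration of the fusion step described above, the sketch below concatenates three per-post modality vectors (text features, image features, visual attributes) and scores them with a linear head followed by a sigmoid. All dimensions, weights, and the concatenation-based fusion are illustrative assumptions, not the paper's actual architecture or trained parameters.

```python
import math
import random

random.seed(0)

# Hypothetical per-post feature vectors for the three modalities.
# Dimensions are assumed purely for illustration.
text_feat = [random.gauss(0, 1) for _ in range(8)]    # learned text features
image_feat = [random.gauss(0, 1) for _ in range(8)]   # learned image features
attr_feat = [random.gauss(0, 1) for _ in range(4)]    # visual attributes

# Simple concatenation fusion of the three modalities.
fused = text_feat + image_feat + attr_feat

# Linear classification head; in a real system these weights are learned.
weights = [random.gauss(0, 1) for _ in range(len(fused))]
bias = 0.0

logit = sum(w * x for w, x in zip(weights, fused)) + bias
risk_score = 1.0 / (1.0 + math.exp(-logit))  # probability-like score in (0, 1)
```

A score above a chosen threshold would flag the post's author for clinician review; in practice the fusion and classifier would be trained jointly on labeled data rather than using random weights.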