Human cognitive and analytical capabilities are indispensable to success in cyber defense. However, the high volume of network data challenges the detection of cyber-attacks, especially zero-day attacks. Training with detailed and timely outcome feedback is a major factor in improving performance: it supports attribute identification and rule formation, which are crucial to the detection of attacks. To understand the role of feedback during training and how it influences the detection of novel attacks, we developed a simplified Intrusion Detection System and trained 160 participants to perform as analysts. During training, participants classified network events representing a specific cyber-attack and received feedback at the end of each trial. Detailed feedback used color schemes indicating hits, misses, false alarms, and correct rejections; aggregated feedback provided numerical summaries of performance. After training, participants classified events that were either similar to the trained attack or part of a novel attack. Results show that detailed feedback accelerated learning and improved detection accuracy compared to aggregated feedback. Participants who received aggregated feedback failed to learn the role of certain network attributes and how to integrate them into detection rules. Surprisingly, aggregated feedback improved detection of the novel attack: the novelty of a situation increased decision scrutiny, whereas familiar decision situations limited the depth of information search and evaluation. Analysts should learn to abstract information and look broadly at outcome feedback to improve their ability to detect novel attacks. We discuss the implications of these findings for improving cyber security.