The K-means and EM clustering algorithms both rely on an expectation step. For finite mixture models, the quantity computed in this step is called the responsibility. Responsibility is closely related to the softmax layer commonly used for classification tasks in deep learning, and with a small change the two can be combined to form a Responsible Softmax layer. Below I briefly show the connection between responsibility and softmax, then share how well the combined layer works on data generated by a Gaussian Mixture Model.
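
As a rough illustration of that connection (a minimal sketch, not the Responsible Softmax layer itself), the snippet below checks numerically that Gaussian mixture responsibilities are exactly a softmax over the components' log joint densities, `log pi_k + log N(x | mu_k, Sigma_k)`. The toy parameters and helper names here are my own choices for illustration.

```python
# Sketch: GMM responsibility == softmax of log joint densities.
# Toy parameters below are arbitrary, chosen only for illustration.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# A 2-component, 2-D Gaussian mixture.
weights = np.array([0.3, 0.7])                    # mixing proportions pi_k
means = np.array([[0.0, 0.0], [3.0, 3.0]])        # component means mu_k
covs = [np.eye(2), 2.0 * np.eye(2)]               # component covariances Sigma_k

x = rng.normal(size=2)                            # a single observation

# Responsibility computed directly from its definition:
# gamma_k = pi_k N(x | mu_k, Sigma_k) / sum_j pi_j N(x | mu_j, Sigma_j)
densities = np.array([
    w * multivariate_normal.pdf(x, mean=m, cov=c)
    for w, m, c in zip(weights, means, covs)
])
resp_direct = densities / densities.sum()

# The same quantity written as a softmax over log joint densities.
logits = np.array([
    np.log(w) + multivariate_normal.logpdf(x, mean=m, cov=c)
    for w, m, c in zip(weights, means, covs)
])
resp_softmax = np.exp(logits - logits.max())      # shift for numerical stability
resp_softmax /= resp_softmax.sum()

print(np.allclose(resp_direct, resp_softmax))     # True
```

In other words, feeding the log joint densities into an ordinary softmax recovers the E-step responsibilities, which is the bridge the rest of the post builds on.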