This project advocates a user-centric approach that quantifies the dimensions of the trust experience when one engages with a complex system instead of another human. It contributes to enhancing transparency, user intervenability, and accountability by proposing a user-friendly service to support the design of trusted interactions. Besides quantifying which aspects of a complex system induce participants' self-reported experience of trust, this service also explains to what extent these findings can be generalized and applied as design principles. It further combines context-aware methods to assess how well the tool can support reflection on ethical concerns associated with trustworthiness and on indicators of perceived risk during interaction. The project builds on three assumptions. First, trust is key to supporting users and minimizing the risk associated with data breaches and misuse. Second, despite the mainstream attention given to privacy and security concerns, we continue to treat individuals as products that can be manipulated to our preferences. Third, existing user experience (UX) evaluation methods either present overly complex interpretations of what trust is and what it represents in context, or oversimplify it, forgetting that trusting is highly dependent on social context and that individuals' reactions to it are social. They fail, for instance, to fully capture the complex relationship between existing distrusted technology-reliant ecosystems and individuals' interpretations of privacy concerns, strategies, and needs. This sometimes results in designing systems that force individuals to trust even when they are fully aware that privacy and data breaches can occur (e.g., Facebook). Such a lack of explicit trust communication and transparency disregards the fact that trusting is an important factor in any social interaction, even when the entities we interact with are not human, and that once a trust bond is broken it is hard to recover.