Metadata related to musical content has become a very popular research topic over the past years. The widespread adoption of ID3 embedded metadata in mp3 files and the release of the MPEG-7 standard are only the surface of a phenomenon that is attracting more and more attention in the research labs of many companies and universities. The main reason for this interest is that music has recently, and very rapidly, become a genuinely electronic resource, one that is managed through a computer and can be downloaded over the Internet. The consequences of this fact are enormous. First of all, there is a clear need to manage large sets of music files on a hard disk, which is of course quite different from managing them on a bookcase. This issue has been addressed quite successfully by several popular software applications, which allow you to index music files by author, title and genre. Another need is to find your way through the very large, and increasingly indistinct, offering of musical content accessible over the Internet, as has already happened for other kinds of media: it may not be hard to find something you are looking for, but discovering interesting content of which you are not yet aware might be really difficult. Besides these needs, this scenario also opens up new possibilities. First of all, the simple title, author and genre paradigm can be enhanced in order to provide more interesting heuristics and services relying on them. Secondly, music can be manipulated and listened to in a much more interactive way than on a CD player. All these aspects are targeted by the Semantic Hifi European project, which coordinates scientific research, social investigation, musical knowledge and information science in order to propose new concepts for browsing, searching and listening to musical content.
This agenda is really challenging, not least because things get complicated once you start investigating what metadata related to musical content actually is. You soon realize that, beyond a very small set of well-defined concepts, most of the important parameters that people use to classify music depend strongly on the cultural context or even on subjective judgment. And this is not just due to a lack of standardization. There really does not seem to be a single right way to describe music, nor can such a way be invented and imposed on people if we want it to be useful for their needs. On the other hand, it would make little sense if everybody used their own criteria, because there are certainly many overlaps among different people's descriptions that can be exploited. For these reasons, we need a description logic that is as flexible as possible, but that still guarantees the maximum amount of interoperability at any degree of customization. This is where RDF/OWL comes into play. In this discussion we will give an overview of the Semantic Hifi project and examine a set of use cases that outline the role of the Semantic Web standards in our implementation plans.
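The combination of flexibility and interoperability described above can be illustrated with a minimal sketch in plain Python (no RDF library; all namespace and property names below are hypothetical, not taken from the project): two users describe the same track with RDF-style triples, mixing a shared core vocabulary (title, author, genre) with personal extensions, and an application that understands only the core vocabulary can still operate on the shared subset of the merged descriptions.

```python
# Minimal sketch of RDF-style triples: a shared core vocabulary plus
# per-user extensions. All vocabulary names here are hypothetical.

CORE = "core:"  # shared, well-defined concepts (title, author, genre)

# Two users describe the same track. Each mixes the shared core
# vocabulary with a personal one (aliceVocab:, bobVocab:).
alice = [
    ("track:42", CORE + "title",  "So What"),
    ("track:42", CORE + "author", "Miles Davis"),
    ("track:42", "aliceVocab:mood", "late-night"),
]
bob = [
    ("track:42", CORE + "title",  "So What"),
    ("track:42", CORE + "genre",  "modal jazz"),
    ("track:42", "bobVocab:danceability", "low"),
]

def core_view(triples):
    """Keep only the statements made with the shared vocabulary."""
    return {(s, p, o) for (s, p, o) in triples if p.startswith(CORE)}

# Merging the two descriptions: personal extensions coexist, while an
# application that knows only the core vocabulary still interoperates
# on the shared subset (duplicate statements collapse into one).
shared = core_view(alice + bob)
for s, p, o in sorted(shared):
    print(s, p, o)
```

In a real implementation the plain string prefixes would be RDF namespaces and the shared vocabulary an OWL ontology, but the mechanism is the same: custom properties extend, rather than replace, the common ones.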