Event-related brain potential (ERP) components are taken to reflect distinct neural mechanisms related to syntax and semantics in both language and music. While semantic processing is associated with the negative component N400 in both domains, syntactic processing elicits an Early Right Anterior Negativity (ERAN) for music and an Early Left Anterior Negativity (ELAN) for language, both peaking around 150-250 ms. Furthermore, the P600 component reflects structural integration in both domains, while the N5 component, observed only for music, relates to the build-up of harmonic context, i.e., syntactic meaning. Despite evidence for distinguishable electrophysiological responses, musical syntax and semantics do not show as clear a divide as their linguistic counterparts do. This raises a question: is it possible to dissociate semantics from syntax in music as it is in language? Reviewing the literature, a negative answer will be suggested, namely that musical syntax has an inherent semantic component, for the following reasons: a) the musical harmonic priming paradigm, which measures the strength of representations of syntactic hierarchy, was borrowed from linguistic semantic priming, which measures the strength of semantic representations; b) the connectionist model used to account for cognitive representations of musical syntactic hierarchy was designed on the model of the linguistic semantic network; c) Chomsky's "green ideas" sentence shows that the syntactic and semantic levels are independent in language, but what about music? Consistent with the electrophysiological data, it is not possible to disrupt the syntactic level of music without corrupting the semantic one. Finally, a glimpse into their common ancestor will shed light on their different pragmatic uses, and hence on the way they are processed.