Rastier, François: "On Signs and Texts"


It should be added that the division of knowledge into physics, ethics and semiotics, even as Eco understands it, remains cut to a scholastic pattern. It does not take into account one major innovation: the progressive constitution of aesthetics from the Renaissance to the Enlightenment, and its integration into philosophy by Baumgarten, Wolff and Kant. Admittedly, some cognitive scientists (notably Petitot) have resumed the Kantian program of transcendental aesthetics, as a theory of the a priori conditions of perception, but without raising the problem of judgements of taste. In fact, cognitive research is conspicuously devoid of aesthetic questions.
For traditional cognitivism in particular, things are simple: there is only one world, and only one type of science. It thereby perpetuates logical positivism: with the thesis of the unity of science, the Vienna Circle had set out not only to eliminate metaphysics but also to place all sciences on the common methodological ground of the physical sciences. The agenda to "naturalize" meaning, in which Sperber justifiably sees "the Holy Grail of cognitive philosophy" (see note 21), arose directly from there.
In this agenda, semiosis is only one aspect of the physical world. As for semiotics as a discipline, it plays a very minor role, and debates have primarily centred on one of its sub-disciplines, linguistics (provided, that is, that it is defined as the semiotics of languages, an apposite description in my view). In fact, the only semiotic theory followed by cognitivism, albeit uncritically, is Morris and Carnap's semiotics, namely the division between syntax, semantics, and pragmatics.
It is not really relevant here that this tripartition has been, for half a century, the most obtrusive epistemological stumbling block in the development of the language sciences. For cognitivism it is a highly functional proviso. Logical positivism indeed harbours a connatural contradiction between its empiricism and its logical grand scheme (objects of perception do not share the ontological stability and self-identity of logical objects). Cognitivism allays this mismatch semantically: the signification of signs is defined and guaranteed by atomic entities. Semantics, in that it defines the pairing of word and object, resolves the empirical difficulty, whereas syntax, as a rigorously regulated combinatory, confers logical validity on the predicative qualifications of objects.
However, cognitivism obviously does not share the anti-psychologism of logical positivism. It restores the possibility of subjecting representations to scientific description. Correlatively, it breaks free from the theory of direct denotation, a radical innovation of Carnap's which quite simply did away with the conceptual pole of signification, thereby defined as the direct linkage of a symbol (or signifier, in Saussurian terms) and a referent. The theory of necessary and sufficient conditions (NSC) in semantics bears witness to this reduction.
In retrospect, however much it vilified behaviourism, cognitivism appears to improve on it only somewhat: it maintains the input/output model, which enables it to pursue the data-processing metaphor, but it rightly keeps inspection of the "black box" open as an option. However, the conceptual level thus re-instituted is given only a single format, the propositional format (propositional computation). The theory of direct denotation, in line with formal languages, is thus maintained and transposed, at the cost of some psychological contortions, into the new framework of a computational psychology.