Look up positivism in Wiktionary, the free dictionary.
Positivism is a philosophy holding that the only authentic knowledge is scientific knowledge; it was central to the foundation of academic sociology.
Positivism may also refer to:
See also