In many cognitive domains, comprehenders construct structured, discrete representations of the environment. Because information is distributed over time, and partial information may not unambiguously identify a single representation, multiple possible structures must be maintained during incremental comprehension. How can the continuous-time, continuous-state neural cognitive system address these challenges? We propose a neural network approach, building on previous research in the Gradient Symbolic Computation framework in the domain of sentence processing. We introduce brick roles, a neurally plausible, scalable distributed representation encoding binary tree structures. The appropriate structure is computed via an optimization process implementing a probabilistic context-free grammar. In the face of structural uncertainty encountered during incremental parsing, optimization yields conjunctive blends: states in which multiple possible structures are simultaneously present (in contrast to disjunctive representations such as probabilistic mixtures). The degree of blending is controlled by a commitment parameter that drives local parsing decisions. We introduce a novel training algorithm for learning optimization parameters, and an improved policy for controlling commitment over a range of grammars. This provides a computational foundation for developing proposals integrating continuous and discrete aspects of sentence processing.
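The distinction between a conjunctive blend and a probabilistic mixture can be sketched with tensor-product representations, the standard encoding in Gradient Symbolic Computation. The sketch below is purely illustrative, not the paper's implementation: the filler vectors, role vectors, dimensionality, and blend weight are all hypothetical choices. A blend is a single state vector superimposing two candidate structures, so unbinding a role recovers a graded combination of both candidates' fillers at once.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the paper's brick roles):
# a structure is a sum of filler-role bindings f (x) r, and a
# conjunctive blend superimposes two candidate parses in ONE state.

rng = np.random.default_rng(0)
dim = 64  # hypothetical embedding dimension

# Hypothetical filler and role vectors; random high-dimensional
# vectors are approximately orthogonal, which makes unbinding work.
f_noun, f_verb = rng.standard_normal((2, dim))
r_left, r_right = rng.standard_normal((2, dim))

def bind(filler, role):
    """Tensor-product binding of a filler to a structural role."""
    return np.outer(filler, role)

# Two candidate parses, each a sum of filler-role bindings.
parse_a = bind(f_noun, r_left) + bind(f_verb, r_right)
parse_b = bind(f_verb, r_left) + bind(f_noun, r_right)

# Conjunctive blend: one state partially containing both parses.
# w plays the role of a (hypothetical) commitment level toward parse_a.
w = 0.6
blend = w * parse_a + (1 - w) * parse_b

# Unbinding the left role from the blend recovers a graded mix of
# BOTH fillers, showing both structures coexist in a single state.
recovered = blend @ r_left / (r_left @ r_left)

def cos(u, v):
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

sim_noun = cos(recovered, f_noun)  # stronger: weight w = 0.6
sim_verb = cos(recovered, f_verb)  # weaker but present: 1 - w = 0.4
```

A disjunctive mixture, by contrast, would hold `parse_a` or `parse_b` on any given sample rather than a single vector containing partial activations of both; raising `w` toward 1 collapses the blend onto one discrete structure.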