We propose a general framework, dubbed stochastic processing under imperfect information, to study the impact of information constraints and memory on dynamic resource allocation. The framework involves a stochastic processing network (SPN) scheduling problem in which the scheduler may access the system state only through a noisy channel, and resource allocation decisions must be carried out through the interaction between an encoding policy (which observes the state) and an allocation policy (which chooses the allocation). Applications in the management of large-scale data centers and human-in-the-loop service systems are among our chief motivations. We quantify the degree to which information constraints reduce the size of the capacity region in general SPNs, and how this reduction depends on the amount of memory available to the encoding and allocation policies. Using a novel metric, the capacity factor, our main theorem characterizes the reduction in capacity region (under “optimal” policies) for all nondegenerate channels and across almost all combinations of memory sizes. Notably, the theorem demonstrates, in substantial generality, that (1) the presence of a noisy channel always reduces capacity, (2) more memory for the allocation policy always improves capacity, and (3) more memory for the encoding policy has little to no effect on capacity. Finally, all of our positive (achievability) results are established through constructive, implementable policies. Our proof program involves the development of a host of new techniques, largely from first principles, combining ideas from information theory, learning, and queueing theory. As a submodule of one of the proposed policies, we create a simple yet powerful generalization of the maximum-weight (max-weight) policy, in which individual Markov chains are selected dynamically, in a manner analogous to how schedules are used in a conventional max-weight policy.
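To fix ideas, the conventional max-weight rule referenced above can be summarized as follows; the notation here ($Q_i(t)$, $\mathcal{S}$, $\sigma$) is illustrative and not part of the paper's formal model. At each time slot $t$, the scheduler selects a schedule that maximizes the queue-length-weighted service rate,
\[
\sigma(t) \in \arg\max_{\sigma \in \mathcal{S}} \; \sum_{i} Q_i(t)\, \sigma_i ,
\]
where $Q_i(t)$ denotes the length of queue $i$ and $\mathcal{S}$ the set of feasible schedules (service vectors). The generalization described in the abstract performs an analogous selection, but over a family of individual Markov chains rather than over static service vectors.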