Lines Matching refs:allocate

541 // again to allocate from it.
563 "could not allocate from secondary_free_list");
570 "the only time we use this to allocate a humongous region is "
595 // Currently, only attempts to allocate GC alloc regions set
629 // Only one region to allocate, no need to go through the slower
639 // We can't allocate humongous regions while cleanupComplete() is
643 // that we only need to do this if we need to allocate more than
685 // The word size sum of all the regions we will allocate.
764 // re-allocate them. We can extend is_empty() to also include
815 // If the number of regions we're trying to allocate for this
891 // after a Full GC, it's unlikely we'll be able to allocate now.
925 // (attempt_allocation()) failed to allocate.
946 // allocate a new region. So the mutator alloc region should be NULL.
986 // failed to allocate. No point in trying to allocate
1091 // If we failed to allocate the humongous object, we should try to
1104 // failed to allocate. No point in trying to allocate
1171 // in a region we might allocate into, then it would prevent that card
1791 // expand_by() was unable to allocate the HeapRegion instances
2158 vm_exit_during_initialization("Failed to allocate initial heap.");
2204 // Here we allocate the dummy full region that is required by the
2445 // And as a result the region we'll allocate will be humongous.
2455 // If we can't allocate once, we probably cannot allocate
2774 // closure on the "starts humongous" region might de-allocate
4271 // has been subsequently used to allocate a humongous
4281 // we allocate to in the region sets. We'll re-add it later, when
4410 // Forward-to-self failed. Either someone else managed to allocate
4457 // Let's try to allocate in the old gen in case we can fit the
4467 // Let's try to allocate in the survivors in case we can fit the
4494 // we allocate G1YoungSurvRateNumRegions plus one entries, since
4657 HeapWord* obj_ptr = _par_scan_state->allocate(alloc_purpose, word_sz);
4676 // We're going to allocate linearly, so might as well prefetch ahead.
5965 // dirty allocated blocks as they allocate them. The thread that
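
Many of the matches above come from G1's allocation slow paths, where the code attempts an allocation and, on failure, either gives up or expands/collects before retrying (see the fragments at source lines 986, 1091, 1104, 1791, and 2455). Below is a minimal, hypothetical C++ sketch of that attempt-then-fallback shape only; the names ToyRegion, try_allocate, and expand_and_retry are illustrative assumptions, not HotSpot APIs.

#include <cstddef>
#include <cstdio>
#include <vector>

// Toy bump-pointer region used only to illustrate the attempt/fallback
// pattern referenced in the listing; this is not how G1 is implemented.
class ToyRegion {
public:
    explicit ToyRegion(std::size_t capacity) : _buf(capacity), _top(0) {}

    // First attempt: allocate from the current region without growing it.
    void* try_allocate(std::size_t size) {
        if (_top + size > _buf.size()) {
            return nullptr;               // out of space; caller decides what to do
        }
        void* result = _buf.data() + _top;
        _top += size;
        return result;
    }

    // Fallback, loosely analogous in spirit to the "expand_by" fragment:
    // grow the backing store and retry once. Resizing may move the buffer,
    // which is safe here only because nothing has been handed out yet.
    void* expand_and_retry(std::size_t size) {
        _buf.resize(_buf.size() + size);
        return try_allocate(size);
    }

private:
    std::vector<char> _buf;
    std::size_t _top;
};

int main() {
    ToyRegion region(16);
    void* p = region.try_allocate(64);
    if (p == nullptr) {
        // Mirrors the slow-path comments above: after one failure, either
        // give up or expand/collect before attempting the allocation again.
        p = region.expand_and_retry(64);
    }
    std::printf("allocation %s\n", p ? "succeeded" : "failed");
}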