Merged
Changes from 1 commit
Deduplicate allocation fallback in PyRef::new_ref
youknowone committed Mar 4, 2026
commit 2266d940fc13e6a001261c48bf7c5336c6a982c3
35 changes: 18 additions & 17 deletions crates/vm/src/object/core.rs
@@ -1783,24 +1783,25 @@ impl&lt;T: PyPayload + crate::object::MaybeTraverse + core::fmt::Debug&gt; PyRef&lt;T&gt; {
         let is_heaptype = typ.heaptype_ext.is_some();

         // Try to reuse from freelist (exact type only, no dict, no heaptype)
-        let ptr = if !has_dict && !is_heaptype {
-            if let Some(cached) = unsafe { T::freelist_pop() } {
-                let inner = cached.as_ptr() as *mut PyInner<T>;
-                unsafe {
-                    core::ptr::write(&mut (*inner).ref_count, RefCount::new());
-                    (*inner).gc_bits.store(0, Ordering::Relaxed);
-                    core::ptr::drop_in_place(&mut (*inner).payload);
-                    core::ptr::write(&mut (*inner).payload, payload);
-                    // typ, vtable, slots are preserved; dict is None, weak_list was
-                    // cleared by drop_slow_inner before freelist push
-                }
-                // Drop the caller's typ since the cached object already holds one
-                drop(typ);
-                unsafe { NonNull::new_unchecked(inner.cast::<Py<T>>()) }
-            } else {
-                let inner = Box::into_raw(PyInner::new(payload, typ, dict));
-                unsafe { NonNull::new_unchecked(inner.cast::<Py<T>>()) }
+        let cached = if !has_dict && !is_heaptype {
+            unsafe { T::freelist_pop() }
+        } else {
+            None
+        };
+
+        let ptr = if let Some(cached) = cached {
+            let inner = cached.as_ptr() as *mut PyInner<T>;
+            unsafe {
+                core::ptr::write(&mut (*inner).ref_count, RefCount::new());
+                (*inner).gc_bits.store(0, Ordering::Relaxed);
+                core::ptr::drop_in_place(&mut (*inner).payload);
+                core::ptr::write(&mut (*inner).payload, payload);
+                // typ, vtable, slots are preserved; dict is None, weak_list was
+                // cleared by drop_slow_inner before freelist push
+            }
+            // Drop the caller's typ since the cached object already holds one
+            drop(typ);
+            unsafe { NonNull::new_unchecked(inner.cast::<Py<T>>()) }
Comment on lines +1792 to +1804
Contributor

⚠️ Potential issue | 🟠 Major

Reused freelist objects keep the old typ, which can return objects with stale class identity.

In the cached-object branch, the incoming typ is dropped and the cached typ is preserved. Because freelists are keyed by payload type T (not by typ instance), reuse across context lifetimes on the same thread can bind a new object to an old typ.

Suggested fix
 let ptr = if let Some(cached) = cached {
     let inner = cached.as_ptr() as *mut PyInner<T>;
     unsafe {
         core::ptr::write(&mut (*inner).ref_count, RefCount::new());
         (*inner).gc_bits.store(0, Ordering::Relaxed);
+        core::ptr::drop_in_place(&mut (*inner).typ);
+        core::ptr::write(&mut (*inner).typ, PyAtomicRef::from(typ));
         core::ptr::drop_in_place(&mut (*inner).payload);
         core::ptr::write(&mut (*inner).payload, payload);
-        // typ, vtable, slots are preserved; dict is None, weak_list was
+        // vtable, slots are preserved; dict is None, weak_list was
         // cleared by drop_slow_inner before freelist push
     }
-    // Drop the caller's typ since the cached object already holds one
-    drop(typ);
     unsafe { NonNull::new_unchecked(inner.cast::<Py<T>>()) }
 } else {

As per coding guidelines: "Follow Rust best practices for error handling and memory management".
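The hazard the review describes can be reproduced outside the interpreter. The sketch below is a simplified model, not RustPython's real API: a String stands in for the PyAtomicRef<PyType> class reference and a plain Vec stands in for the freelist. It shows why a cache keyed only by payload type must refresh the stored type on reuse.

```rust
// Hypothetical stand-ins: `typ` is a String here, not a PyAtomicRef<PyType>,
// and the freelist is a plain Vec. The shape of the bug is the same.

struct Obj {
    typ: String, // class identity, analogous to PyInner<T>::typ
    payload: u32,
}

// Buggy reuse: the cached object's old `typ` is silently kept.
fn new_ref_stale(freelist: &mut Vec<Box<Obj>>, typ: String, payload: u32) -> Box<Obj> {
    if let Some(mut obj) = freelist.pop() {
        obj.payload = payload; // payload refreshed...
        drop(typ); // ...but the caller's typ is discarded
        obj
    } else {
        Box::new(Obj { typ, payload })
    }
}

// Fixed reuse: overwrite the stored `typ` with the caller's; the assignment
// drops the old value, mirroring the drop_in_place + write in the suggestion.
fn new_ref_fresh(freelist: &mut Vec<Box<Obj>>, typ: String, payload: u32) -> Box<Obj> {
    if let Some(mut obj) = freelist.pop() {
        obj.typ = typ;
        obj.payload = payload;
        obj
    } else {
        Box::new(Obj { typ, payload })
    }
}

fn main() {
    // An object whose class lived in an old context is pushed to the freelist.
    let mut freelist = vec![Box::new(Obj {
        typ: "TypeFromOldContext".into(),
        payload: 0,
    })];

    let stale = new_ref_stale(&mut freelist, "TypeFromNewContext".into(), 1);
    assert_eq!(stale.typ, "TypeFromOldContext"); // stale class identity

    freelist.push(stale);
    let fresh = new_ref_fresh(&mut freelist, "TypeFromNewContext".into(), 2);
    assert_eq!(fresh.typ, "TypeFromNewContext"); // identity refreshed
}
```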

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `crates/vm/src/object/core.rs` around lines 1792-1804, the cached-object branch currently drops the incoming typ and leaves the cached object's typ unchanged, causing stale class identity. Instead, in the cached branch (inside the unsafe block handling inner: *mut PyInner<T>), replace the stored typ with the incoming one: drop the old typ in place with core::ptr::drop_in_place(&mut (*inner).typ) and write the new value with core::ptr::write(&mut (*inner).typ, PyAtomicRef::from(typ)) before leaving the unsafe block, and do not call drop(typ) afterwards. This updates PyInner<T>::typ when reusing freelist objects while preserving the existing payload/vtable handling.

} else {
let inner = Box::into_raw(PyInner::new(payload, typ, dict));
unsafe { NonNull::new_unchecked(inner.cast::<Py<T>>()) }
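Setting the review thread aside, the commit itself follows a general refactoring pattern: hoist the conditional lookup into an Option so every miss path funnels into a single fallback arm. A minimal sketch with hypothetical stand-ins (a Vec as the cache and a plain Box allocation, not the real freelist/PyInner machinery):

```rust
fn alloc_fresh(v: i32) -> Box<i32> {
    Box::new(v)
}

// Before: the allocation fallback is written twice, once when the cache is
// not usable and once on a cache miss.
fn make_duplicated(cache: &mut Vec<Box<i32>>, cache_usable: bool, v: i32) -> Box<i32> {
    if cache_usable {
        if let Some(mut b) = cache.pop() {
            *b = v;
            b
        } else {
            alloc_fresh(v) // fallback, copy #1
        }
    } else {
        alloc_fresh(v) // fallback, copy #2
    }
}

// After: compute the Option first, as the commit does with
// `let cached = if ... { freelist_pop() } else { None };`.
fn make_deduplicated(cache: &mut Vec<Box<i32>>, cache_usable: bool, v: i32) -> Box<i32> {
    let cached = if cache_usable { cache.pop() } else { None };
    if let Some(mut b) = cached {
        *b = v;
        b
    } else {
        alloc_fresh(v) // single fallback
    }
}

fn main() {
    let mut cache = vec![Box::new(0)];
    assert_eq!(*make_deduplicated(&mut cache, true, 7), 7); // cache hit
    assert_eq!(*make_deduplicated(&mut cache, true, 8), 8); // miss -> fresh alloc
    assert_eq!(*make_duplicated(&mut cache, false, 9), 9); // cache disabled -> fresh alloc
}
```

Both functions behave identically; the second merely removes the duplicated arm, which is what keeps the two versions of the allocation fallback from drifting apart over time.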