While values(*field_excluding_pk).distinct() and
distinct(*field_excluding_pk) can reduce the number of resulting rows
in a way that makes subsequent delete() calls ambiguous, standalone
.distinct() calls cannot.
Since delete() already disallows chained usage with values(), the only
case that needs to be handled, as originally reported, is when
DISTINCT ON is used via distinct(*fields).
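For illustration, assuming a hypothetical Entry model with an author
field, the distinction is roughly:

Entry.objects.distinct().delete()  # plain DISTINCT: unambiguous, still allowed
Entry.objects.distinct("author").delete()  # DISTINCT ON collapses rows; rejected
Entry.objects.values("author").distinct().delete()  # already rejected after values()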
Refs #32682, which had to resort to subqueries to prevent duplicates in
the admin and caused significant performance regressions on MySQL
(refs #34639).
This partly reverts 6307c3f1a123f5975c73b231e8ac4f115fd72c0d.
Special thanks to Hannes Ljungberg for finding multiple implementation
gaps.
Thanks also to Simon Charette, Adam Johnson, and Mariusz Felisiak for
reviews.
Black 23.1.0 has been released and, as the first release of the year,
introduces the 2023 stable style, which incorporates most of last
year's preview style.
https://github.com/psf/black/releases/tag/23.1.0
By not annotating aggregations that may be pushed to an outer query
until their references are resolved, it becomes possible to aggregate
over a query with the same alias.
Even if #34176 is a convoluted case to support, this refactor seems
worth it given the reduction in complexity it brings with regard to
annotation removal when performing a subquery pushdown.
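A rough sketch of the kind of query this enables, assuming a
hypothetical Book model with a chapters relation:

from django.db.models import Count, Sum

# The aggregate reuses the "pages" alias of the annotation it references;
# the annotation is only pushed down into the subquery once its
# references are resolved.
Book.objects.annotate(pages=Count("chapters")).aggregate(pages=Sum("pages"))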
Until we add support for truly asynchronous database backends, it's
actually detrimental to have complete set retrieval require multiple
hops between sync and async emulated contexts via asgiref.
By defaulting to sending the whole sync _fetch_all() to the current
context's thread pool, we reduce unnecessary work when dealing with large
result sets since the queryset cannot be iterated anyway before all
results are retrieved and cached.
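A condensed sketch of the idea (not the actual implementation; it leans
on the queryset's private _fetch_all() and _result_cache members):

from asgiref.sync import sync_to_async

async def afetch_all_sketch(queryset):
    # One sync -> async hop: run the entire synchronous fetch in the
    # thread pool instead of hopping per row, since all results must be
    # retrieved and cached before the queryset can be iterated anyway.
    await sync_to_async(queryset._fetch_all)()
    return queryset._result_cache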
The lack of a _for_write = True assignment in bulk_update prior to
accessing self.db resulted in the db_for_read database being used to
wrap batched UPDATEs in a transaction.
Also tweaked the batch queryset creation to ensure the batches are
executed against the same database as the opened transaction under all
circumstances.
Refs #23646, #33501.
In these cases, Black produces unexpected results, e.g.:
def make_random_password(
    self,
    length=10,
    allowed_chars='abcdefghjkmnpqrstuvwxyz' 'ABCDEFGHJKLMNPQRSTUVWXYZ' '23456789',
):
or
cursor.execute("""
    SELECT ...
""",
    [table name],
)
The functionality of updating the __dict__ was only present to allow
for pickling a Prefetch object, which is a comparatively rare operation.
Forcing the __getstate__() implementation to handle it allows the much
more common chaining operation to be slightly faster.
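A simplified sketch of the approach (attribute names follow Django's
Prefetch, but this is not the full implementation):

class Prefetch:
    def __init__(self, lookup, queryset=None):
        self.prefetch_through = lookup
        self.queryset = queryset

    def __getstate__(self):
        # Only pickling pays for this copy now; chaining no longer has to
        # rebuild __dict__ on every operation.
        obj_dict = self.__dict__.copy()
        if self.queryset is not None:
            queryset = self.queryset._chain()
            # Avoid pickling cached results along with the query.
            queryset._result_cache = []
            queryset._prefetch_done = True
            obj_dict["queryset"] = queryset
        return obj_dict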