Advanced Techniques
Custom Sorting with Key Function
Multi-Dimensional Sorting
```python
## Sorting complex objects
students = [
    {'name': 'Alice', 'score': 85, 'age': 22},
    {'name': 'Bob', 'score': 85, 'age': 20},
    {'name': 'Charlie', 'score': 90, 'age': 21}
]

## Max by multiple criteria
top_student = max(students, key=lambda x: (x['score'], -x['age']))
print(f"Top student: {top_student['name']}")
```
```mermaid
flowchart TD
    A[Input Data] --> B{Custom Key Function}
    B --> C[Efficient Comparison]
    C --> D[Minimal Iterations]
    D --> E[Optimal Result]
```
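The "Minimal Iterations" step reflects the fact that `max()` makes a single pass over the input and calls the key function exactly once per element. A small sketch (not part of the original example) that demonstrates this by counting key calls:

```python
# Count how many times max() invokes the key function.
calls = 0

def counting_key(item):
    global calls
    calls += 1
    return item['score']

data = [{'score': 85}, {'score': 85}, {'score': 90}]
best = max(data, key=counting_key)
print(best, calls)  # {'score': 90} 3  (one key call per element)
```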
Advanced Comparison Techniques
| Technique | Description | Example |
|-----------|-------------|---------|
| Lambda Functions | Custom comparison logic | `max(items, key=lambda x: x.property)` |
| Multiple Criteria | Tuple keys for tie-breaking across several fields | `max(items, key=lambda x: (x.a, x.b))` |
| Nested Comparisons | Hierarchical comparison of object attributes | `max(objects, key=attrgetter('attr1', 'attr2'))` |
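The `attrgetter` row refers to `operator.attrgetter`, which builds a key function from one or more attribute names. A minimal sketch with a hypothetical `Employee` class (the class and data are illustrative, not from the original):

```python
from dataclasses import dataclass
from operator import attrgetter

@dataclass
class Employee:
    name: str
    level: int
    salary: int

staff = [
    Employee('Dana', 3, 90000),
    Employee('Evan', 3, 95000),
    Employee('Fay', 2, 120000),
]

# attrgetter('level', 'salary') returns a (level, salary) tuple per object,
# so level is compared first and salary only breaks ties.
senior = max(staff, key=attrgetter('level', 'salary'))
print(senior.name)  # Evan
```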
Handling Complex Data Structures
Nested List Comparison
```python
## Max in nested lists
nested_lists = [[1, 2], [3, 4], [5, 6]]
max_sublist = max(nested_lists, key=sum)
print(f"Sublist with maximum sum: {max_sublist}")
```
Functional Programming Approaches
Using operator Module
```python
import operator

## Advanced max with operator
items = [
    {'id': 1, 'value': 10},
    {'id': 2, 'value': 20},
    {'id': 3, 'value': 15}
]
max_item = max(items, key=operator.itemgetter('value'))
print(f"Maximum value item: {max_item}")
```
Memory-Efficient Techniques
Generator Expressions
```python
## Max with generator expressions
def large_data_generator():
    for i in range(1000000):
        yield i

max_value = max(large_data_generator())
print(f"Maximum value: {max_value}")
```
LabEx Pro Tip
Leverage advanced max() techniques to write more efficient and readable Python code.
Error Handling and Edge Cases
Robust Comparison
```python
## Handling None values among the arguments
def safe_max(*args):
    try:
        return max(arg for arg in args if arg is not None)
    except ValueError:
        return None

result = safe_max(10, None, 20, 5)
print(f"Safe max result: {result}")
```
Parallel Processing Considerations
Max in Parallel Computing
```python
from multiprocessing import Pool

def find_max_chunk(chunk):
    return max(chunk)

def parallel_max(data):
    with Pool() as pool:
        chunks = [data[i:i+1000] for i in range(0, len(data), 1000)]
        max_values = pool.map(find_max_chunk, chunks)
        return max(max_values)

if __name__ == "__main__":
    large_list = list(range(1000000))
    result = parallel_max(large_list)
    print(f"Parallel max: {result}")
```