Welcome to part 2 of our advanced Python programming guide! The beginner tutorial covered core concepts like syntax, data structures, functions, classes, and modules.
Now, let's dive deeper into some advanced techniques experienced Python developers use. This guide assumes you already have a strong grasp of Python basics.
Object-Oriented Design Principles
Python supports object-oriented programming, allowing developers to organize code into class hierarchies modeled after real-world entities.
Mastering OOP techniques is key to designing and building robust large-scale applications in Python.
Some key object-oriented design principles:
- Encapsulation
This involves bundling related attributes and behaviors into individual classes. For example, a Person class would encapsulate properties like name, age, and behaviors like walking and talking.
Encapsulation allows control over the data through well-defined interfaces along with information hiding. Methods act as the interface, while attributes are kept private to hide complexity.
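As a minimal sketch of this idea (the specific Person attributes here are illustrative), a property can serve as the public interface to a conventionally private attribute:

```python
class Person:
    def __init__(self, name, age):
        self._name = name   # leading underscore: internal by convention
        self._age = age

    @property
    def age(self):
        return self._age

    @age.setter
    def age(self, value):
        # The setter is the well-defined interface guarding the data
        if value < 0:
            raise ValueError("age cannot be negative")
        self._age = value

    def greet(self):
        return f"Hi, I'm {self._name}"

p = Person("Ada", 36)
p.age = 37            # goes through the validating setter
print(p.greet())
```

Callers interact only with `age` and `greet()`; the underlying storage can change without affecting them.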
- Inheritance
Classes can inherit commonly used state and behaviors from parent classes. For example, a Student class can inherit from a base Person class to avoid rewriting duplicated code. The child class only needs to define properties and methods unique to students.
Inheritance enables reuse of code and polymorphism. Subclasses can extend, override, or modify inherited logic as needed.
- Abstraction
This involves exposing only relevant data/methods through public interfaces while hiding unnecessary implementation details. For example, an abstract class Shape could define an area() method while concrete subclasses Circle, Square implement the actual area calculations.
Abstraction reduces complexity and keeps code loosely coupled by separating high-level logic from low-level details. Interfaces help maintain abstraction.
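A minimal sketch of the Shape example using the standard library's abc module (the exact class layout is illustrative):

```python
from abc import ABC, abstractmethod
import math

class Shape(ABC):
    @abstractmethod
    def area(self):
        """Concrete subclasses must implement this."""

class Circle(Shape):
    def __init__(self, radius):
        self.radius = radius

    def area(self):
        return math.pi * self.radius ** 2

class Square(Shape):
    def __init__(self, side):
        self.side = side

    def area(self):
        return self.side ** 2

# Callers work against the abstract area() interface only
shapes = [Circle(1), Square(3)]
print([round(s.area(), 2) for s in shapes])
```

Attempting to instantiate `Shape` directly raises a TypeError, which enforces the abstraction at runtime.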
- Composition
This refers to combining objects to model complex behaviors. Rather than rely on inheritance alone, objects can use other objects via composition. For example, a Car class could compose objects like a Wheel or engine rather than directly inheriting their capabilities.
Composition provides flexibility and encapsulation for complex object interactions.
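A minimal sketch of the Car example (the Engine and Wheel details are illustrative):

```python
class Engine:
    def start(self):
        return "engine started"

class Wheel:
    def __init__(self, position):
        self.position = position

class Car:
    def __init__(self):
        # A Car has-an Engine and has Wheels, rather than is-an Engine
        self.engine = Engine()
        self.wheels = [Wheel(p) for p in ("FL", "FR", "RL", "RR")]

    def start(self):
        # Delegate to the composed engine object
        return self.engine.start()

car = Car()
print(car.start())
print(len(car.wheels))
```

The Car exposes its own interface while delegating work to the objects it composes, keeping each part independently replaceable.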
By following these principles, Python programs can implement domain entities and business logic in an organized, object-oriented manner. Let's look at some examples.
We'll model a zoo management system with classes for animals, zookeepers, and enclosures using inheritance, polymorphism, encapsulation, and composition:
# Animal base class
class Animal:
    def __init__(self, name, species):
        self.name = name
        self.species = species

    def make_sound(self):
        print(f"{self.name} says Rawwwr!")

# Inherited child classes
class Lion(Animal):
    def make_sound(self):
        print(f"{self.name} says Roar!")

class Snake(Animal):
    def make_sound(self):
        print(f"{self.name} says Hiss!")

# Zookeeper class
class Zookeeper:
    def __init__(self, name):
        self.name = name

    def feed_animal(self, animal):
        print(f"{self.name} is feeding the {animal.species} named {animal.name}")
        animal.make_sound()  # Polymorphism

# Enclosure class
class Enclosure:
    def __init__(self, id, animals):
        self.id = id
        self.animals = animals

    def add_animal(self, animal):
        self.animals.append(animal)

# Create objects
leo = Lion("Leo", "Lion")
marty = Snake("Marty", "Python")
bob = Zookeeper("Bob")
snakes_enclosure = Enclosure(123, [marty])

bob.feed_animal(leo)
snakes_enclosure.add_animal(leo)
This demonstrates modeling domain entities with encapsulation, inheritance, polymorphism, and composition. Code reuse is improved, coupling reduced, and abstraction maintained.
Python Decorators
Decorators dynamically alter the functionality of a function, method, or class without having to modify the code directly. They essentially wrap the original object and modify its behavior as needed before executing it.
Decorators are written with the @ symbol on the line directly above the definition they wrap. For example:
@timer
def run_long_job(args):
    # Function body
    ...
Here @timer is a decorator that measures how long run_long_job takes to execute.
Let's see how to build this timer decorator:
import time
import functools

def timer(func):
    # Inner wrapper function
    @functools.wraps(func)  # preserve the wrapped function's name and docstring
    def inner(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        end = time.time()
        print(f"Execution took {end - start} seconds")
        return result
    # Return inner function
    return inner

@timer
def long_running_job(n):
    print("Running long job...")
    time.sleep(n)
    return "Done!"

long_running_job(5)
# Prints execution time
When decorated, long_running_job behavior is extended with the timer functionality without modifying its code.
Some other common use cases for Python decorators:
- Logging function arguments and results
- Checking permissions or roles before executing functions
- Caching return values to avoid recalculation
- Rate limiting function calls
- Instrumenting code for tracing or profiling
- Validating input data types
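For instance, the caching use case can be sketched with a hand-rolled memoize decorator (the standard library's functools.lru_cache does the same job more robustly):

```python
import functools

def memoize(func):
    """Cache results keyed by positional arguments (hashable args only)."""
    cache = {}

    @functools.wraps(func)
    def wrapper(*args):
        if args not in cache:
            cache[args] = func(*args)
        return cache[args]
    return wrapper

@memoize
def fib(n):
    # Naive recursion made fast: each subproblem is computed only once
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040
```

Without the cache, `fib(30)` would make over a million recursive calls; with it, each value is computed once.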
Decorators supercharge Python with metaprogramming capabilities and expressiveness. They are widely used across frameworks like Flask, Django, etc. Understanding decorators unlocks their powerful capabilities.
Concurrency in Python
Concurrency means making progress on multiple tasks at once, whether by running them in parallel or by interleaving them asynchronously. Python supports several concurrency models to improve program efficiency and speed.
Some approaches include:
- Threads
Threads run concurrently within the same interpreter process, with the OS scheduling their execution. Note that CPython's Global Interpreter Lock (GIL) allows only one thread to execute Python bytecode at a time, so threads do not speed up CPU-bound Python code.
For IO-bound tasks, threads improve the utilization of idle time that would otherwise be spent waiting. The threading module supports spawning and synchronizing threads:
from threading import Thread

def io_bound_worker():
    # Perform IO-intensive work (e.g. network or disk access)
    ...

threads = [Thread(target=io_bound_worker) for _ in range(8)]
for thread in threads:
    thread.start()
# Main thread continues executing
- Multiprocessing
For CPU-bound tasks, Python's multiprocessing module distributes work across multiple processes. Each worker process runs its own Python interpreter, so work can proceed on multiple CPU cores, circumventing the GIL limitation.
Processes have higher overhead than threads but enable true parallelism across multiple CPUs:
from multiprocessing import Process

def cpu_bound_worker(data):
    # Perform heavy computations
    ...

if __name__ == "__main__":
    inputs = [large_dataset] * 8  # large_dataset is a placeholder for real input
    processes = []
    for input_data in inputs:
        p = Process(target=cpu_bound_worker, args=(input_data,))
        processes.append(p)
        p.start()
    # Worker processes now run in parallel with the main process
- asyncio
This module provides infrastructure for writing asynchronous code using async/await syntax. It is well-suited for tasks involving network I/O and concurrency:
import asyncio
import aiohttp  # third-party: pip install aiohttp

async def fetch_data(url):
    # Async HTTP request
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.text()

async def main():
    urls = [url1, url2, url3]  # placeholders for real URLs
    tasks = []
    for url in urls:
        tasks.append(fetch_data(url))
    results = await asyncio.gather(*tasks)

asyncio.run(main())
Asyncio helps build highly performant network apps by efficiently handling thousands of concurrent connections.
- concurrent.futures
This high-level module abstracts thread and process pools for executing callables asynchronously:
import concurrent.futures

with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
    futures = [executor.submit(cpu_bound_fn, arg) for arg in args]  # cpu_bound_fn, args: placeholders
    results = [f.result() for f in futures]
The Future objects provide a clean interface to wait for and retrieve results. The module handles pooling and concurrency under the hood.
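A complete, runnable version of the pool pattern, using a stand-in square task in place of a real workload:

```python
import concurrent.futures

def square(n):
    # Stand-in for real work
    return n * n

with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
    futures = [executor.submit(square, n) for n in range(10)]
    # as_completed yields futures as they finish, in completion order
    results = sorted(f.result() for f in concurrent.futures.as_completed(futures))

print(results)
```

`as_completed` lets the caller consume results as soon as each task finishes rather than waiting for submission order.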
Together these approaches enable Python developers to speed up programs through parallelism, utilize multiple cores, and handle thousands of concurrent connections.
Metaprogramming with Metaclasses
While classes in Python enable creating multiple objects, metaclasses allow you to customize how the classes themselves are constructed and modified at a meta level.
Metaclasses intercept class creation and can modify a class before it is finalized: for example, automatically registering models in a registry, applying mixins, or interfacing with ORMs.
To write a metaclass, subclass type and override its __new__ and __init__ methods:
class RegistryMeta(type):
    registry = {}

    def __new__(cls, name, bases, attrs):
        # Modify attrs
        attrs['id'] = len(RegistryMeta.registry)
        # Build class as normal
        klass = type.__new__(cls, name, bases, attrs)
        # Register class
        RegistryMeta.registry[klass.id] = klass
        return klass

    def __init__(self, name, bases, attrs):
        print(f"Initializing {name} class")
        super().__init__(name, bases, attrs)
Any class created with this metaclass, directly or through a base class, is intercepted and registered:
class Base(metaclass=RegistryMeta):
    pass

class Person(Base):
    pass

print(RegistryMeta.registry)
# {0: <class '__main__.Base'>, 1: <class '__main__.Person'>}
Metaclasses open up powerful metaprogramming capabilities and customization hooks to Python's class construction process.
Dynamic Attribute Access
Unlike statically typed languages, Python enables objects to have attributes added dynamically at runtime, beyond those explicitly defined in __init__ or elsewhere in the class.
For example:
class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y

p = Point(2, 3)
p.z = 5  # New attribute created dynamically
print(p.z)
# Outputs 5
This dynamic behavior can be useful in certain cases like:
- Implementing caching as attributes
- Lazily creating attributes only when accessed
- Proxy or delegate classes that reroute attribute access
- Dynamic mixins that add capabilities to classes
However, it can also make code harder to understand and trace since attributes aren't explicitly defined beforehand.
Python supports the __slots__ class attribute to restrict this behavior. It tells Python to allow only the attributes listed there and to raise AttributeError for anything else:
class Point:
    __slots__ = ['x', 'y']

    def __init__(self, x, y):
        self.x = x
        self.y = y

p = Point(2, 3)
p.z = 5  # AttributeError!
So __slots__ prevents surprise behaviors from dynamic attributes.
Descriptor Protocol
This advanced protocol provides the underlying mechanics of how attributes like properties and methods work in Python.
Descriptors essentially control attribute access on objects. They are implemented as classes containing __get__, __set__ and __delete__ methods.
For example, the @property decorator works through descriptors:
class Property:
    def __init__(self, fget):
        self.fget = fget

    def __get__(self, obj, owner):
        return self.fget(obj)

class Point:
    def __init__(self, x):
        self.x = x

    @Property
    def y(self):
        return self.x * 2

p = Point(10)
print(p.y)  # Calls getter internally
Here, the Property descriptor class implements __get__ to call the underlying y method when accessed as an attribute.
Some other examples of descriptors:
- @classmethod and @staticmethod to define different method types
- Custom memoization decorators that cache method return values
- __slots__ to restrict attributes
- ORM frameworks mapping DB rows to Python objects
The descriptor protocol is a key ingredient that enables much of Python's magic like @property, class methods, static methods, etc. Mastering descriptors unlocks deeper capabilities.
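The __set__ side of the protocol also enables validated attributes. A minimal sketch (the Positive and Order names are illustrative):

```python
class Positive:
    """Data descriptor that rejects non-positive values."""
    def __set_name__(self, owner, name):
        self.name = name              # called automatically at class creation

    def __get__(self, obj, owner):
        if obj is None:
            return self
        return obj.__dict__[self.name]

    def __set__(self, obj, value):
        if value <= 0:
            raise ValueError(f"{self.name} must be positive")
        obj.__dict__[self.name] = value

class Order:
    quantity = Positive()
    price = Positive()

    def __init__(self, quantity, price):
        self.quantity = quantity      # routed through Positive.__set__
        self.price = price

order = Order(3, 9.99)
print(order.quantity * order.price)
```

One descriptor instance validates every Order, so the rule lives in a single place instead of being repeated in each setter.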
Context Managers
While the with statement is best known for file cleanup, Python lets you implement the same pattern for any object through context managers.
For example, acquiring and releasing a lock or database connection:
class Resource:
    def __enter__(self):
        print("Acquiring resource")
        return self  # bound to the `as` target

    def __exit__(self, exc_type, exc_val, exc_tb):
        print("Releasing resource")

with Resource() as resource:
    # Use resource
    ...
This ensures reliable cleanup through the __exit__ method. Context managers can also suppress exceptions during cleanup:
class Resource:
    # ...
    def __exit__(self, exc_type, exc_val, exc_tb):
        print("Handling exception")
        # Suppress the exception by returning True
        return True
The contextlib module provides utilities like @contextmanager to simplify creating context managers.
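A minimal sketch of @contextmanager, recording events in a list so the acquire/release ordering is visible (the resource name is illustrative):

```python
import contextlib

events = []

@contextlib.contextmanager
def resource(name):
    events.append(f"acquire {name}")
    try:
        yield name                        # value bound by `with ... as`
    finally:
        events.append(f"release {name}")  # runs even if the body raises

with resource("db") as r:
    events.append(f"use {r}")

print(events)
```

Everything before the yield plays the role of __enter__ and the finally block plays the role of __exit__, replacing a full class with one generator function.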
Some real-world examples include:
- File opening
- Lock acquiring/release
- Database connections
- Temporary directory handling
- Logging redirected to buffer
Context managers provide a robust way to handle resources in Python.
Unit Testing
Writing tests is vital for validating code quality and ensuring proper behavior as the codebase grows. Python comes with a built-in unittest framework for authoring and running unit tests.
The key components are test case classes, individual test methods, assertions, and test runners:
import unittest

# Assumes a User class with full_name(), initials(), and send_email() defined elsewhere
class UserTestCase(unittest.TestCase):
    # setUp runs before each test method
    def setUp(self):
        self.user = User("John", "Doe")

    def test_full_name(self):
        self.assertEqual(self.user.full_name(), "John Doe")

    def test_initials(self):
        self.assertEqual(self.user.initials(), "J.D")

    @unittest.expectedFailure
    def test_send_email(self):
        self.user.send_email("test@example.com")

if __name__ == "__main__":
    unittest.main()  # Run all tests
This allows for organizing related tests into reusable test cases. Functionality like fixtures, assertions, mocking, and test runners handle the testing workflow.
Some other Python testing tools include pytest for a streamlined experience and mocks for isolating code dependencies.
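For instance, unittest.mock can stand in for a dependency so a unit of code is tested in isolation (the mailer object and notify function here are hypothetical):

```python
from unittest.mock import Mock

def notify(mailer, address):
    # The code under test: delegates delivery to its mailer dependency
    return mailer.send(address)

# Replace the real mailer with a Mock so no email is actually sent
mailer = Mock()
mailer.send.return_value = True

assert notify(mailer, "test@example.com")
mailer.send.assert_called_once_with("test@example.com")
```

The mock both supplies a canned return value and records how it was called, letting the test verify the interaction without real side effects.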
Thorough testing improves code quality and reduces bugs in the long run. Tests empower developers to refactor and iterate rapidly.
Generators and Iterators
Generators let a function pause and resume its execution, lazily producing a sequence of values one at a time through iteration. This is useful for:
- Dealing with large datasets without loading everything into memory
- Implementing streams and efficient pipelines
- Avoiding eager allocation of resources until needed
In Python, generators are defined using yield instead of return:
def num_sequence(n):
    for i in range(n):
        yield i

seq = num_sequence(3)
print(next(seq))  # 0
print(next(seq))  # 1
When called, generators return a generator object that supports the iteration protocol. Lazy iteration enables efficient streaming:
def read_log_file(file):
    with open(file) as f:  # ensure the file is closed after iteration
        for line in f:
            yield process(line)  # process() is assumed to be defined elsewhere

for event in read_log_file("logs.txt"):
    print(event)
Generators allow implementation of Python iterators elegantly. Popular libraries like Django ORM use them extensively for lazy querying.
Asynchronous Programming
Python 3.5 introduced async/await syntax to natively support asynchronous code using asyncio:
import asyncio
import aiohttp  # third-party: pip install aiohttp

async def fetch(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.text()

async def main():
    urls = ["url1", "url2", "url3"]
    tasks = []
    for url in urls:
        tasks.append(fetch(url))
    results = await asyncio.gather(*tasks)
    print(results)

asyncio.run(main())
Asyncio provides an event loop to orchestrate concurrent tasks and asynchronous I/O efficiently. It is well suited for highly parallel network programs.
Python Typing
Type hints allow adding static types to function arguments and return values for static analysis:
from typing import List

def sum_numbers(nums: List[int]) -> int:
    return sum(nums)
This metadata enables better error-catching, IDE autocompletion, and documentation. The code still runs dynamically, as usual.
Popular third-party tools like MyPy leverage these type hints to provide optional static type checking for Python. Typing brings some of the benefits of static languages to Python.
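Type hints go beyond concrete types; a small sketch using a TypeVar for a generic function (the first helper is illustrative):

```python
from typing import Optional, Sequence, TypeVar

T = TypeVar("T")

def first(items: Sequence[T]) -> Optional[T]:
    """Return the first element, or None for an empty sequence."""
    return items[0] if items else None

print(first([10, 20, 30]))  # 10
print(first([]))            # None
```

A type checker like MyPy infers that `first([10, 20, 30])` is `Optional[int]` while `first("abc")` is `Optional[str]`, preserving type information through the call.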
Python Packaging
Python code is typically organized into modules or packages. The Python Packaging Index (PyPI) contains thousands of open-source packages with functionality beyond the standard library.
Some best practices for structuring Python code for others to use:
- Set up a project structure with src and tests folders
- Write a setup.py script for pip installation
- Include a requirements.txt with dependencies
- Follow Semantic Versioning for releases
- Upload the package to PyPI for public sharing
Tools like Poetry, Flit, and pipenv streamline handling dependencies and packaging. Sharing reusable packages enables collective code reuse in Python.
Conclusion
This concludes our advanced guide to modern Python. We covered important techniques like:
- Object-oriented programming principles
- Metaprogramming with metaclasses and decorators
- Improving performance through concurrency
- Robust resource handling with context managers
- Unit testing and maintaining code quality
- Lazy generation of sequences for efficiency
- Asynchronous I/O handling
- Static type hints
- Python packaging ecosystem
These capabilities enable large, robust applications, libraries, and frameworks to be developed in Python.
The key is always to keep learning through documentation, books, open-source code, conferences, and trying out ideas. Python has an amazing community.
I hope you enjoyed reading this guide and feel motivated to apply these techniques in your own Python projects.
If you found this post helpful, you can find more on the Learnhub Blog; we write about everything tech, from cloud computing to frontend dev, cybersecurity, AI, and blockchain.