Concurrency in .NET

Title Concurrency in .NET
Author Riccardo Terrell
Subtitle Modern patterns of concurrent and parallel programming
ISBN 1617292990
Year 2018
Pages 500
File Format PDF
Tags ASP.NET

Book Description

Functional languages help developers support concurrency by encouraging immutable data structures that can be passed between threads without worrying about shared state, all while avoiding side effects.
Concurrency in .NET teaches readers how to build concurrent and scalable programs in .NET using the functional paradigm. This intermediate-level guide is aimed at developers, architects, and passionate computer programmers.
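The description's central claim is that immutable data can be shared freely between threads because no thread can mutate it. As a minimal sketch of that idea outside .NET (the book itself uses C# and F#; the names below are hypothetical, not from the book), here is a Python illustration where several threads read the same immutable tuple and communicate results only through a thread-safe queue:

```python
import threading
from queue import Queue

# An immutable tuple shared across threads: safe to read concurrently
# because no thread can modify it, so no lock is needed.
data = tuple(range(10))

results = Queue()  # thread-safe channel for the workers' results

def worker(chunk):
    # Pure computation over the immutable input; no shared mutable state.
    results.put(sum(x * x for x in chunk))

# Split the tuple into two interleaved slices, one per thread.
threads = [
    threading.Thread(target=worker, args=(data[i::2],))
    for i in range(2)
]
for t in threads:
    t.start()
for t in threads:
    t.join()

total = sum(results.get() for _ in range(2))
print(total)  # sum of squares of 0..9 = 285
```

The same discipline underlies the book's material on .NET immutable collections and message-passing agents: threads exchange values instead of sharing mutable structures.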

Table of Contents

  • Concurrency in .NET
  • brief contents
  • contents
  • preface
  • acknowledgments
  • about this book
  • about the author
  • about the cover illustration
  • Part 1: Benefits of functional programming applicable to concurrent programs
    • 1 Functional concurrency foundations
    • 1.1 What you’ll learn from this book
    • 1.2 Let’s start with terminology
    • 1.2.1 Sequential programming performs one task at a time
    • 1.2.2 Concurrent programming runs multiple tasks at the same time
    • 1.2.3 Parallel programming executes multiple tasks simultaneously
    • 1.2.4 Multitasking performs multiple tasks concurrently over time
    • 1.2.5 Multithreading for performance improvement
    • 1.3 Why the need for concurrency?
    • 1.3.1 Present and future of concurrent programming
    • 1.4 The pitfalls of concurrent programming
    • 1.4.1 Concurrency hazards
    • 1.4.2 The sharing of state evolution
    • 1.4.3 A simple real-world example: parallel quicksort
    • 1.4.4 Benchmarking in F#
    • 1.5 Why choose functional programming for concurrency?
    • 1.5.1 Benefits of functional programming
    • 1.6 Embracing the functional paradigm
    • 1.7 Why use F# and C# for functional concurrent programming?
    • 2 Functional programming techniques for concurrency
    • 2.1 Using function composition to solve complex problems
    • 2.1.1 Function composition in C#
    • 2.1.2 Function composition in F#
    • 2.2 Closures to simplify functional thinking
    • 2.2.1 Captured variables in closures with lambda expressions
    • 2.2.2 Closures in a multithreading environment
    • 2.3 Memoization-caching technique for program speedup
    • 2.4 Memoize in action for a fast web crawler
    • 2.5 Lazy memoization for better performance
    • 2.5.1 Gotchas for function memoization
    • 2.6 Effective concurrent speculation to amortize the cost of expensive computations
    • 2.6.1 Precomputation with natural functional support
    • 2.6.2 Let the best computation win
    • 2.7 Being lazy is a good thing
    • 2.7.1 Strict languages for understanding concurrent behaviors
    • 2.7.2 Lazy caching technique and thread-safe Singleton pattern
    • 2.7.3 Lazy support in F#
    • 2.7.4 Lazy and Task, a powerful combination
    • 3 Functional data structures and immutability
    • 3.1 Real-world example: hunting the thread-unsafe object
    • 3.1.1 .NET immutable collections: a safe solution
    • 3.1.2 .NET concurrent collections: a faster solution
    • 3.1.3 The agent message-passing pattern: a faster, better solution
    • 3.2 Safely sharing functional data structures among threads
    • 3.3 Immutability for a change
    • 3.3.1 Functional data structure for data parallelism
    • 3.3.2 Performance implications of using immutability
    • 3.3.3 Immutability in C#
    • 3.3.4 Immutability in F#
    • 3.3.5 Functional lists: linking cells in a chain
    • 3.3.6 Building a persistent data structure: an immutable binary tree
    • 3.4 Recursive functions: a natural way to iterate
    • 3.4.1 The tail of a correct recursive function: tail-call optimization
    • 3.4.2 Continuation passing style to optimize recursive functions
  • Part 2: How to approach the different parts of a concurrent program
    • 4 The basics of processing big data: data parallelism, part 1
    • 4.1 What is data parallelism?
    • 4.1.1 Data and task parallelism
    • 4.1.2 The “embarrassingly parallel” concept
    • 4.1.3 Data parallelism support in .NET
    • 4.2 The Fork/Join pattern: parallel Mandelbrot
    • 4.2.1 When the GC is the bottleneck: structs vs. class objects
    • 4.2.2 The downside of parallel loops
    • 4.3 Measuring performance speed
    • 4.3.1 Amdahl’s Law defines the limit of performance improvement
    • 4.3.2 Gustafson’s Law: a step further to measure performance improvement
    • 4.3.3 The limitations of parallel loops: the sum of prime numbers
    • 4.3.4 What can possibly go wrong with a simple loop?
    • 4.3.5 The declarative parallel programming model
    • 5 PLINQ and MapReduce: data parallelism, part 2
    • 5.1 A short introduction to PLINQ
    • 5.1.1 How is PLINQ more functional?
    • 5.1.2 PLINQ and pure functions: the parallel word counter
    • 5.1.3 Avoiding side effects with pure functions
    • 5.1.4 Isolate and control side effects: refactoring the parallel word counter
    • 5.2 Aggregating and reducing data in parallel
    • 5.2.1 Deforesting: one of many advantages to folding
    • 5.2.2 Fold in PLINQ: Aggregate functions
    • 5.2.3 Implementing a parallel Reduce function for PLINQ
    • 5.2.4 Parallel list comprehension in F#: PSeq
    • 5.2.5 Parallel arrays in F#
    • 5.3 Parallel MapReduce pattern
    • 5.3.1 The Map and Reduce functions
    • 5.3.2 Using MapReduce with the NuGet package gallery
    • 6 Real-time event streams: functional reactive programming
    • 6.1 Reactive programming: big event processing
    • 6.2 .NET tools for reactive programming
    • 6.2.1 Event combinators—a better solution
    • 6.2.2 .NET interoperability with F# combinators
    • 6.3 Reactive programming in .NET: Reactive Extensions (Rx)
    • 6.3.1 From LINQ/PLINQ to Rx
    • 6.3.2 IObservable: the dual of IEnumerable
    • 6.3.3 Reactive Extensions in action
    • 6.3.4 Real-time streaming with Rx
    • 6.3.5 From events to F# observables
    • 6.4 Taming the event stream: Twitter emotion analysis using Rx programming
    • 6.4.1 SelectMany: the monadic bind operator
    • 6.5 An Rx publisher-subscriber
    • 6.5.1 Using the Subject type for a powerful publisher-subscriber hub
    • 6.5.2 Rx in relation to concurrency
    • 6.5.3 Implementing a reusable Rx publisher-subscriber
    • 6.5.4 Analyzing tweet emotions using an Rx Pub-Sub class
    • 6.5.5 Observers in action
    • 6.5.6 The convenient F# object expression
    • 7 Task-based functional parallelism
    • 7.1 A short introduction to task parallelism
    • 7.1.1 Why task parallelism and functional programming?
    • 7.1.2 Task parallelism support in .NET
    • 7.2 The .NET Task Parallel Library
    • 7.2.1 Running operations in parallel with TPL Parallel.Invoke
    • 7.3 The problem of void in C#
    • 7.3.1 The solution for void in C#: the unit type
    • 7.4 Continuation-passing style: a functional control flow
    • 7.4.1 Why exploit CPS?
    • 7.4.2 Waiting for a task to complete: the continuation model
    • 7.5 Strategies for composing task operations
    • 7.5.1 Using mathematical patterns for better composition
    • 7.5.2 Guidelines for using tasks
    • 7.6 The parallel functional Pipeline pattern
    • 8 Task asynchronicity for the win
    • 8.1 The Asynchronous Programming Model (APM)
    • 8.1.1 The value of asynchronous programming
    • 8.1.2 Scalability and asynchronous programming
    • 8.1.3 CPU-bound and I/O-bound operations
    • 8.2 Unbounded parallelism with asynchronous programming
    • 8.3 Asynchronous support in .NET
    • 8.3.1 Asynchronous programming breaks the code structure
    • 8.3.2 Event-based Asynchronous Programming
    • 8.4 C# Task-based Asynchronous Programming
    • 8.4.1 Anonymous asynchronous lambdas
    • 8.4.2 Task is a monadic container
    • 8.5 TAP: a case study
    • 8.5.1 Asynchronous cancellation
    • 8.5.2 Task-based asynchronous composition with the monadic Bind operator
    • 8.5.3 Deferring asynchronous computation enables composition
    • 8.5.4 Retry if something goes wrong
    • 8.5.5 Handling errors in asynchronous operations
    • 8.5.6 Asynchronous parallel processing of the historical stock market
    • 8.5.7 Asynchronous stock market parallel processing as tasks complete
    • 9 Asynchronous functional programming in F#
    • 9.1 Asynchronous functional aspects
    • 9.2 What’s the F# asynchronous workflow?
    • 9.2.1 The continuation passing style in computation expressions
    • 9.2.2 The asynchronous workflow in action: Azure Blob storage parallel operations
    • 9.3 Asynchronous computation expressions
    • 9.3.1 Difference between computation expressions and monads
    • 9.3.2 AsyncRetry: building your own computation expression
    • 9.3.3 Extending the asynchronous workflow
    • 9.3.4 Mapping asynchronous operation: the Async.map functor
    • 9.3.5 Parallelize asynchronous workflows: Async.Parallel
    • 9.3.6 Asynchronous workflow cancellation support
    • 9.3.7 Taming parallel asynchronous operations
    • 10 Functional combinators for fluent concurrent programming
    • 10.1 The execution flow isn’t always on the happy path: error handling
    • 10.1.1 The problem of error handling in imperative programming
    • 10.2 Error combinators: Retry, Otherwise, and Task.Catch in C#
    • 10.2.1 Error handling in FP: exceptions for flow control
    • 10.2.2 Handling errors with Task&lt;Option&lt;T&gt;&gt; in C#
    • 10.2.3 The F# AsyncOption type: combining Async and Option
    • 10.2.4 Idiomatic F# functional asynchronous error handling
    • 10.2.5 Preserving the exception semantic with the Result type
    • 10.3 Taming exceptions in asynchronous operations
    • 10.3.1 Modeling error handling in F# with Async and Result
    • 10.3.2 Extending the F# AsyncResult type with monadic bind operators
    • 10.4 Abstracting operations with functional combinators
    • 10.5 Functional combinators in a nutshell
    • 10.5.1 The TPL built-in asynchronous combinators
    • 10.5.2 Exploiting the Task.WhenAny combinator for redundancy and interleaving
    • 10.5.3 Exploiting the Task.WhenAll combinator for asynchronous for-each
    • 10.5.4 Mathematical pattern review: what you’ve seen so far
    • 10.6 The ultimate parallel composition applicative functor
    • 10.6.1 Extending the F# async workflow with applicative functor operators
    • 10.6.2 Applicative functor semantics in F# with infix operators
    • 10.6.3 Exploiting heterogeneous parallel computation with applicative functors
    • 10.6.4 Composing and executing heterogeneous parallel computations
    • 10.6.5 Controlling flow with conditional asynchronous combinators
    • 10.6.6 Asynchronous combinators in action
    • 11 Applying reactive programming everywhere with agents
    • 11.1 What’s reactive programming, and how is it useful?
    • 11.2 The asynchronous message-passing programming model
    • 11.2.1 Relation with message passing and immutability
    • 11.2.2 Natural isolation
    • 11.3 What is an agent?
    • 11.3.1 The components of an agent
    • 11.3.2 What an agent can do
    • 11.3.3 The share-nothing approach for lock-free concurrent programming
    • 11.3.4 How is agent-based programming functional?
    • 11.3.5 Agent is object-oriented
    • 11.4 The F# agent: MailboxProcessor
    • 11.4.1 The mailbox asynchronous recursive loop
    • 11.5 Avoiding database bottlenecks with F# MailboxProcessor
    • 11.5.1 The MailboxProcessor message type: discriminated unions
    • 11.5.2 MailboxProcessor two-way communication
    • 11.5.3 Consuming the AgentSQL from C#
    • 11.5.4 Parallelizing the workflow with group coordination of agents
    • 11.5.5 How to handle errors with F# MailboxProcessor
    • 11.5.6 Stopping MailboxProcessor agents—CancellationToken
    • 11.5.7 Distributing the work with MailboxProcessor
    • 11.5.8 Caching operations with an agent
    • 11.5.9 Reporting results from a MailboxProcessor
    • 11.5.10 Using the thread pool to report events from MailboxProcessor
    • 11.6 F# MailboxProcessor: 10,000 agents for a game of life
    • 12 Parallel workflow and agent programming with TPL Dataflow
    • 12.1 The power of TPL Dataflow
    • 12.2 Designed to compose: TPL Dataflow blocks
    • 12.2.1 Using BufferBlock as a FIFO buffer
    • 12.2.2 Transforming data with TransformBlock
    • 12.2.3 Completing the work with ActionBlock
    • 12.2.4 Linking dataflow blocks
    • 12.3 Implementing a sophisticated Producer/Consumer with TDF
    • 12.3.1 A multiple Producer/single Consumer pattern: TPL Dataflow
    • 12.3.2 A single Producer/multiple Consumer pattern
    • 12.4 Enabling an agent model in C# using TPL Dataflow
    • 12.4.1 Agent fold-over state and messages: Aggregate
    • 12.4.2 Agent interaction: a parallel word counter
    • 12.5 A parallel workflow to compress and encrypt a large stream
    • 12.5.1 Context: the problem of processing a large stream of data
    • 12.5.2 Ensuring the order integrity of a stream of messages
    • 12.5.3 Linking, propagating, and completing
    • 12.5.4 Rules for building a TDF workflow
    • 12.5.5 Meshing Reactive Extensions (Rx) and TDF
  • Part 3: Modern patterns of concurrent programming applied
    • 13 Recipes and design patterns for successful concurrent programming
    • 13.1 Recycling objects to reduce memory consumption
    • 13.1.1 Solution: asynchronously recycling a pool of objects
    • 13.2 Custom parallel Fork/Join operator
    • 13.2.1 Solution: composing a pipeline of steps forming the Fork/Join pattern
    • 13.3 Parallelizing tasks with dependencies: designing code to optimize performance
    • 13.3.1 Solution: implementing a dependencies graph of tasks
    • 13.4 Gate for coordinating concurrent I/O operations sharing resources: one write, multiple reads
    • 13.4.1 Solution: applying multiple read/write operations to shared thread-safe resources
    • 13.5 Thread-safe random number generator
    • 13.5.1 Solution: using the ThreadLocal object
    • 13.6 Polymorphic event aggregator
    • 13.6.1 Solution: implementing a polymorphic publisher-subscriber pattern
    • 13.7 Custom Rx scheduler to control the degree of parallelism
    • 13.7.1 Solution: implementing a scheduler with multiple concurrent agents
    • 13.8 Concurrent reactive scalable client/server
    • 13.8.1 Solution: combining Rx and asynchronous programming
    • 13.9 Reusable custom high-performing parallel filter‑map operator
    • 13.9.1 Solution: combining filter and map parallel operations
    • 13.10 Non-blocking synchronous message-passing model
    • 13.10.1 Solution: coordinating the payload between operations using the agent programming model
    • 13.11 Coordinating concurrent jobs using the agent programming model
    • 13.11.1 Solution: implementing an agent that runs jobs with a configured degree of parallelism
    • 13.12 Composing monadic functions
    • 13.12.1 Solution: combining asynchronous operations using the Kleisli composition operator
    • 14 Building a scalable mobile app with concurrent functional programming
    • 14.1 Functional programming on the server in the real world
    • 14.2 How to design a successful performant application
    • 14.2.1 The secret sauce: ACD
    • 14.2.2 A different asynchronous pattern: queuing work for later execution
    • 14.3 Choosing the right concurrent programming model
    • 14.3.1 Real-time communication with SignalR
    • 14.4 Real-time trading: stock market high-level architecture
    • 14.5 Essential elements for the stock market application
    • 14.6 Let’s code the stock market trading application
    • 14.6.1 Benchmark to measure the scalability of the stock ticker application
  • appendix a Functional programming
  • appendix b F# overview
  • appendix c Interoperability between an F# asynchronous workflow and .NET Task
  • index