I’ve been following the 100 Days of SwiftUI course on Hacking with Swift, Paul Hudson’s excellent Swift resource site. On the Slack channel that accompanies the site, a couple of users asked if the 100 Days index could show the subject topics for each day. This post is my attempt to crowdsource that.
I suggested it was something that could be crowdsourced by those going through the course, so it wouldn’t become added work for Paul. This is my take on what the course covers.
SwiftUI Index
The index starts at Day 16 because the preceding days introduce basic Swift language topics for readers who have not previously used the Swift programming language.
Day 16
An introduction to Form, NavigationView and @State. The introduction also touches on using loops in SwiftUI to populate lists or Pickers.
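Boiled down, the core ideas from Day 16 look something like this (the property and title names here are placeholders rather than Paul’s exact code):

```swift
import SwiftUI

// A minimal sketch: a Form inside a NavigationView, an @State property,
// and a ForEach loop populating a Picker.
struct ContentView: View {
    @State private var selection = 2

    var body: some View {
        NavigationView {
            Form {
                Picker("Number of people", selection: $selection) {
                    ForEach(2..<10) {
                        Text("\($0) people")
                    }
                }
            }
            .navigationBarTitle("WeSplit")
        }
    }
}
```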
Day 17
Now we start building the WeSplit App that was introduced in Day 16. This day covers Form, TextField and @State bindings to allow user input. We again use a loop to populate a Picker, then learn that to use a Picker inside a Section we need a NavigationView so that new screens can slide in. Next we meet the segmented control for the first time in a build, and the ability to use a section header to explain the section.
The project also uses a computed property to do some math inside the view.
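As a tiny illustration of the idea (the names and formula below are invented, not the project’s actual calculation), a computed property lets the body simply display a result:

```swift
import SwiftUI

struct TipView: View {
    @State private var checkAmount = ""
    @State private var tipPercentage = 10

    // A computed property does the math inside the view,
    // so the body only has to display the result.
    var totalWithTip: Double {
        let orderAmount = Double(checkAmount) ?? 0
        return orderAmount + (orderAmount / 100 * Double(tipPercentage))
    }

    var body: some View {
        Text("Total: $\(totalWithTip, specifier: "%.2f")")
    }
}
```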
Day 20
Tomorrow we start our second project, so today we are introduced to stacks (HStack, VStack and ZStack), how colors work as and with views, using gradients, buttons and images, and how to show an alert.
Day 21
We build our UI with the use of a ZStack and several VStacks. We use a ForEach loop to create our Buttons. This project gives us practice creating UI and using the @State property wrapper so we can manipulate our variables. We also use some modifiers to change the shape of the button images using .clipShape, .overlay and .shadow.
Day 23
Day 23 starts to look at Views and Modifiers in a bit more depth and explores modifier order, conditional modifiers, views as properties and custom modifiers and containers. Paul explains why a view’s size is smaller than you would expect and ways to fill the screen without breaking your code. Also what the heck some View actually is.
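A custom modifier plus the usual convenience extension might look like this (the Title style here is an invented example):

```swift
import SwiftUI

// A custom ViewModifier bundled up behind a View extension.
struct Title: ViewModifier {
    func body(content: Content) -> some View {
        content
            .font(.largeTitle)
            .foregroundColor(.white)
            .padding()
            .background(Color.blue)
            .clipShape(RoundedRectangle(cornerRadius: 10))
    }
}

extension View {
    func titleStyle() -> some View {
        modifier(Title())
    }
}

// Usage: Text("Hello").titleStyle()
```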
Day 26
After some consolidation and a challenge which asks you to build a Rock, Paper, Scissors App using what you have learnt so far, today we are introduced to the Stepper, DatePicker and DateFormatter, and use bindings to work with the form elements. We also use navigationBarItems() to round out our UI.
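A rough sketch of those form controls, with placeholder property names:

```swift
import SwiftUI

struct BedtimeView: View {
    @State private var sleepAmount = 8.0
    @State private var wakeUp = Date()

    var body: some View {
        NavigationView {
            Form {
                // Stepper bound to a Double, stepping in quarter hours.
                Stepper(value: $sleepAmount, in: 4...12, step: 0.25) {
                    Text("\(sleepAmount, specifier: "%g") hours")
                }
                // DatePicker showing only the time components.
                DatePicker("Wake up time", selection: $wakeUp,
                           displayedComponents: .hourAndMinute)
            }
            .navigationBarTitle("BetterRest")
            .navigationBarItems(trailing: Button("Calculate") { })
        }
    }
}
```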
Day 27
This is a great example of how elements change their presentation depending on their parent container. With a change from VStack to Form, our user interface feels more professional. As a bonus, we even explore some machine learning in this project.
Day 29
Another game project today, and we start to work with Lists. We also learn how to load resources from a file inside our app bundle.
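Loading a text file from the bundle follows a pattern along these lines (the file name is a placeholder):

```swift
import Foundation

// One way to load a text file that ships inside the app bundle.
func loadStartWords() -> [String] {
    if let fileURL = Bundle.main.url(forResource: "start", withExtension: "txt"),
       let contents = try? String(contentsOf: fileURL) {
        return contents.components(separatedBy: "\n")
    }
    // A missing bundle file is a developer error, so crashing loudly is reasonable.
    fatalError("Could not load start.txt from bundle.")
}
```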
Day 32
We start to learn about animations in SwiftUI: how to create implicit animations and customise them, how to animate bindings, and how to create explicit animations.
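A minimal implicit animation sketch; this uses the newer value-based form of the animation() modifier, whereas the course originally used the single-argument version:

```swift
import SwiftUI

// Changing the @State value animates the scale because of the
// animation() modifier attached to the view.
struct PulseButton: View {
    @State private var scale: CGFloat = 1

    var body: some View {
        Button("Tap me") {
            scale += 0.5
        }
        .scaleEffect(scale)
        .animation(.easeInOut(duration: 0.5), value: scale)
    }
}
```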
Day 33
Taking our understanding of animations a bit further, Paul teaches us about the animation stack (how the order of animation modifiers affects the animation, and how multiple animation modifiers can be used), how to animate gestures, and how to show or hide a view with transitions. We also see how to build a custom transition.
Day 36
After some challenges and a milestone project, we move on to a more complex App. Today we meet @ObservedObject and see how it allows classes to be used to pass data around multiple screens. @ObservedObject is used in conjunction with the ObservableObject protocol and @Published to replicate the @State behaviour for classes and their properties.
We also meet .sheet, which is a way to show a second view, and see how @Environment(\.presentationMode) gives us access to programmatically dismiss that view.
Paul shows us how .onDelete is SwiftUI’s control mechanism for deleting items in a list, specifically a list which uses ForEach. We also touch on using UserDefaults to store simple data between App launches and using Codable for more complex types.
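A compressed sketch pulling those pieces together (the type and property names are placeholders):

```swift
import SwiftUI

// A class published to SwiftUI, a sheet, and onDelete on a ForEach.
class Expenses: ObservableObject {
    @Published var items = [String]()
}

struct ExpensesView: View {
    @ObservedObject var expenses = Expenses()
    @State private var showingAddSheet = false

    var body: some View {
        NavigationView {
            List {
                ForEach(expenses.items, id: \.self) { item in
                    Text(item)
                }
                .onDelete { offsets in
                    expenses.items.remove(atOffsets: offsets)
                }
            }
            .navigationBarItems(trailing: Button("Add") { showingAddSheet = true })
            .sheet(isPresented: $showingAddSheet) {
                Text("Add a new expense here")
            }
        }
    }
}
```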
Day 39
This next project introduces GeometryReader, ScrollView and NavigationLink. We revisit using Codable and of course List and Text which most UIs will include in some form or another.
Paul also shows how working with Images can be trickier than you think and how .resizable() and .aspectRatio() can help us render our images in the ways we would expect and how using GeometryReader can query how big our container view is.
We look at the basics of ScrollView and how views added to a ScrollView will be created immediately. There is a demonstration of how a List will use fewer resources as it lazily creates the views whereas the ScrollView creates them all at once.
Paul then explores NavigationView and how this view creates a ‘stack’ which allows views to be pushed onto it. We can achieve this via a NavigationLink. He discusses the difference between using this and .sheet().
Day 40
In order to retrieve data from text files in our App we write an extension to Bundle which decodes the JSON data. We see how Codable can use hierarchical structs to decode data. Paul also shows us a use for generics, which allows us to keep our code DRY (don’t repeat yourself). We also see different options for turning dates in JSON into human-readable formatted dates using dateDecodingStrategy or a DateFormatter().
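The extension follows roughly this shape (the date format and the usage example are assumptions, not necessarily the exact code from the project):

```swift
import Foundation

// A generic Bundle extension keeps the JSON decoding code in one place.
extension Bundle {
    func decode<T: Decodable>(_ file: String) -> T {
        guard let url = self.url(forResource: file, withExtension: nil),
              let data = try? Data(contentsOf: url) else {
            fatalError("Failed to locate or load \(file) in bundle.")
        }

        let decoder = JSONDecoder()
        let formatter = DateFormatter()
        formatter.dateFormat = "y-MM-dd"
        decoder.dateDecodingStrategy = .formatted(formatter)

        guard let decoded = try? decoder.decode(T.self, from: data) else {
            fatalError("Failed to decode \(file) from bundle.")
        }
        return decoded
    }
}

// Usage (assuming an Astronaut type exists):
// let astronauts: [Astronaut] = Bundle.main.decode("astronauts.json")
```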
Day 41
Now we start to build a second view in the App for our missions. We use an additional SwiftUI file to achieve this rather than putting all our code in ContentView.swift. We get to use GeometryReader to help us size our Images and we see how to make sure our PreviewProvider can work even though we are now initialising our view with external data.
Day 43
Today we are introduced to drawing in SwiftUI. We meet Shape, Path, and InsettableShape to investigate how each can produce subtly different results. Paul shows how we can create our own structs to make customised shapes more reusable.
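An invented example of a reusable custom shape built from a Path:

```swift
import SwiftUI

// A custom Shape: the path(in:) method describes the drawing
// relative to whatever rectangle the shape is given.
struct Triangle: Shape {
    func path(in rect: CGRect) -> Path {
        var path = Path()
        path.move(to: CGPoint(x: rect.midX, y: rect.minY))
        path.addLine(to: CGPoint(x: rect.minX, y: rect.maxY))
        path.addLine(to: CGPoint(x: rect.maxX, y: rect.maxY))
        path.closeSubpath()
        return path
    }
}

// Usage: Triangle().stroke(Color.red, lineWidth: 5).frame(width: 200, height: 200)
```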
Day 44
Today we are introduced to some extra drawing tools, the first being CGAffineTransform. We then use ImagePaint to play around with custom borders. Most drawing in SwiftUI uses Core Animation, but the last technique shows that sometimes Core Animation is not fast enough and you may need to drop down to the Metal framework via drawingGroup().
Day 45
Some more graphics work today. We learn about .blendMode: how the .multiply option can be used to tint images, and how .screen allows shapes to interact with each other to produce some nice effects. We also take a look at using animations with Shapes and see that adding withAnimation alone will not produce the animation we would expect. We therefore have to add an animatableData property to our struct to overcome this.
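A sketch of why that property matters, using a trapezoid-style shape as the example:

```swift
import SwiftUI

// SwiftUI animates this shape's inset only because the property
// is exposed through animatableData.
struct Trapezoid: Shape {
    var insetAmount: CGFloat

    var animatableData: CGFloat {
        get { insetAmount }
        set { insetAmount = newValue }
    }

    func path(in rect: CGRect) -> Path {
        var path = Path()
        path.move(to: CGPoint(x: 0, y: rect.maxY))
        path.addLine(to: CGPoint(x: insetAmount, y: 0))
        path.addLine(to: CGPoint(x: rect.maxX - insetAmount, y: 0))
        path.addLine(to: CGPoint(x: rect.maxX, y: rect.maxY))
        path.closeSubpath()
        return path
    }
}
```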
Day 49
We return to looking at data with a deeper look at Codable, and how adding @Published to a property in a class requires us to delve into encoder / decoder methods. We make a quick App that sends a request to the iTunes search API for a list of tracks, which are then displayed in a list after decoding the response from the server.
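In outline, the fetch-and-decode looks something like this (the URL and the Response/Result fields are assumptions about the shape of the iTunes JSON rather than the course’s exact code):

```swift
import Foundation

struct Response: Codable {
    var results: [Result]
}

struct Result: Codable {
    var trackId: Int
    var trackName: String
    var collectionName: String
}

// Fetch the JSON with URLSession, decode it, and hand the results back.
func loadTracks(completion: @escaping ([Result]) -> Void) {
    guard let url = URL(string: "https://itunes.apple.com/search?term=taylor+swift&entity=song") else { return }

    URLSession.shared.dataTask(with: url) { data, response, error in
        if let data = data,
           let decoded = try? JSONDecoder().decode(Response.self, from: data) {
            // Hop back to the main thread before touching any UI state.
            DispatchQueue.main.async {
                completion(decoded.results)
            }
        }
    }.resume()
}
```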
Day 50
Today we start to build an App to order cupcakes. We revisit building forms and see how using the @Published wrapper allows us to pass our data between views. We also look at how to ensure a form cannot be submitted until all the required fields have been entered, using the .disabled modifier.
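A tiny sketch of the validation idea, with placeholder field names:

```swift
import SwiftUI

// Part of the form stays disabled until the required fields are filled in.
struct CheckoutForm: View {
    @State private var name = ""
    @State private var streetAddress = ""

    var hasValidAddress: Bool {
        !name.isEmpty && !streetAddress.isEmpty
    }

    var body: some View {
        Form {
            Section {
                TextField("Name", text: $name)
                TextField("Street address", text: $streetAddress)
            }

            Section {
                Button("Place order") {
                    // submit the order here
                }
            }
            .disabled(!hasValidAddress)
        }
    }
}
```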
Day 51
Back to Codable and how we can use it to send information to a server. We have to deal with the fact that Swift will not automatically encode/decode the object because we have used the @Published property wrapper. To overcome this, we write two methods and use a CodingKeys enum. We then use URLSession to send our encoded order to a server and decode the response we receive.
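The pattern looks roughly like this (the Order class and its properties are placeholders):

```swift
import Combine
import Foundation

// Making a class with @Published properties conform to Codable by hand:
// a CodingKeys enum plus custom init(from:) and encode(to:).
class Order: ObservableObject, Codable {
    enum CodingKeys: CodingKey {
        case type, quantity
    }

    @Published var type = 0
    @Published var quantity = 3

    init() { }

    required init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        type = try container.decode(Int.self, forKey: .type)
        quantity = try container.decode(Int.self, forKey: .quantity)
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(type, forKey: .type)
        try container.encode(quantity, forKey: .quantity)
    }
}
```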
Day 53
After a day of challenges yesterday we move on to the Bookworm App. Here we start to explore the use of @Binding to ensure data updates across structs. We peek at size classes and how we can read them via the @Environment property wrapper. I like how Paul uses different types of stacks depending on the width of the view; to do this he uses AnyView, which means using type erasure. We then start our journey into understanding CoreData and how it works with SwiftUI.
Day 54
Today we really start to get to work on CoreData. We create our Entity and use FetchedResults to start working with the data model. We make a lovely custom view to display a rating as a series of stars. Then we output our list using the FetchedResults from Core Data.
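A sketch of reading those results in a view; it assumes a generated Book entity with a title attribute, which may not match the project exactly:

```swift
import SwiftUI
import CoreData

// Reading Core Data entities in a view via @FetchRequest.
struct BooksView: View {
    @Environment(\.managedObjectContext) var moc
    @FetchRequest(entity: Book.entity(), sortDescriptors: []) var books: FetchedResults<Book>

    var body: some View {
        List(books, id: \.self) { book in
            // "title" is an assumed optional String attribute on the entity.
            Text(book.title ?? "Unknown title")
        }
    }
}
```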
Day 57
Paul explains more about \.self and how it uses Hashable to create a representation of the object. We look at a way to overcome CoreData’s insistence that properties should be optional by creating the class definition manually using an NSManagedObject subclass. We are introduced to the concept of checking for changes before issuing a save in CoreData using the .hasChanges property.
Day 58
I had to download Xcode 11.7 to follow along with this day. For some reason, Xcode 12 added lots of extra code to a blank CoreData project and I couldn’t follow along with this project. Hopefully, this will be updated in a future Xcode release.
Today we explore how NSPredicate can return filtered results from CoreData, and how its syntax is a little obscure. We then explore how to filter dynamically using a second SwiftUI view, creating a custom init() to run the fetch request.
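A sketch of the dynamic-filter idea; the Ship entity and name attribute are assumptions used for illustration:

```swift
import SwiftUI
import CoreData

// The custom init builds the predicate from whatever filter value is passed in.
struct FilteredList: View {
    var fetchRequest: FetchRequest<Ship>

    init(filter: String) {
        fetchRequest = FetchRequest<Ship>(
            entity: Ship.entity(),
            sortDescriptors: [],
            predicate: NSPredicate(format: "name BEGINSWITH %@", filter)
        )
    }

    var body: some View {
        List(fetchRequest.wrappedValue, id: \.self) { ship in
            Text(ship.name ?? "Unknown")
        }
    }
}
```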
Day 62
After a couple of days working on challenges, it is time to explore working with UIKit in SwiftUI. Today Paul explains property wrappers to us in more detail and how to use custom Bindings inside the body of the View in order to run code when a property’s value changes.
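A minimal custom Binding sketch, with an invented blur example:

```swift
import SwiftUI

// The custom Binding's setter lets us run extra code whenever
// the value changes, instead of just storing it.
struct BlurView: View {
    @State private var blurAmount: CGFloat = 0

    var body: some View {
        let blur = Binding<CGFloat>(
            get: { self.blurAmount },
            set: {
                self.blurAmount = $0
                print("New blur value is \($0)") // side effect runs on every change
            }
        )

        return VStack {
            Text("Hello, World!")
                .blur(radius: blurAmount)
            Slider(value: blur, in: 0...20)
        }
    }
}
```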
We then move on to explore the .actionSheet modifier, which allows us to present multiple options in an alert-style sheet.
Day 63
Today we start to look at how to integrate CoreImage with SwiftUI and how to display UIKit views in our SwiftUI code. In this case, we are looking at choosing photos from the user’s photo library with UIImagePickerController, using the UIViewControllerRepresentable protocol.
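The bare skeleton of the wrapper looks like this (the coordinator and delegate wiring come on the next day):

```swift
import SwiftUI
import UIKit

// Wrapping UIImagePickerController so SwiftUI can present it.
struct ImagePicker: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> UIImagePickerController {
        let picker = UIImagePickerController()
        return picker
    }

    func updateUIViewController(_ uiViewController: UIImagePickerController, context: Context) {
        // Nothing to update for this simple wrapper.
    }
}
```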
Day 64
We explore how UIKit uses delegates and learn about coordinators in the context of SwiftUI. UIKit uses delegates to send instructions outside of its classes, and the coordinator approach allows SwiftUI to keep track of what is happening inside UIKit code. We then wade into the mire that is Objective-C so we can call a UIKit method to save our manipulated image, getting our hands dirty with #selector and @objc markers.
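A sketch of the Objective-C flavoured save call:

```swift
import UIKit

// The completion method must be exposed to the Objective-C runtime
// with @objc and referenced via #selector.
class ImageSaver: NSObject {
    func writeToPhotoAlbum(image: UIImage) {
        UIImageWriteToSavedPhotosAlbum(image, self, #selector(saveCompleted), nil)
    }

    @objc func saveCompleted(_ image: UIImage, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {
        print("Save finished, error: \(String(describing: error))")
    }
}
```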
Day 65
Today we start to build the Instafilter App from scratch. Paul firstly introduces us to a way to show different Views conditionally using a simple if statement. We also work through using UIViewControllerRepresentable to present our ImagePicker via UIKit again. We employ a Custom Binding to allow a function to be called when the intensity property changes.
Day 66
We introduce an ActionSheet to our view to let the user pick a filter type. We then create a class to encapsulate our image saving, using the @objc and #selector code from Objective-C. Paul also shows how to pass a closure into this class in order to receive feedback and errors via completion handlers.
Day 68
After another wrap-up day we now start on the next project. Today we look at how we can tell Swift to compare our custom objects, that is, how a struct with multiple properties should be compared in order to sort them. This is achieved via Comparable conformance and a static function called <. We learn how to find the location of our App’s .documentDirectory. We then look at switching our views based on the value of an enum.
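An invented example of the Comparable conformance:

```swift
import Foundation

// Comparable conformance so an array of custom structs can be
// sorted with sorted() directly.
struct User: Identifiable, Comparable {
    let id = UUID()
    let firstName: String
    let lastName: String

    static func < (lhs: User, rhs: User) -> Bool {
        lhs.lastName < rhs.lastName
    }
}

// let sortedUsers = users.sorted()   // uses our < implementation
```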
Day 70
Today we put the techniques we have learned together to create an App that can store MKPointAnnotations and display them with an MKMapView inside a SwiftUI view. We learn about dequeuing annotation views in order to use memory wisely. We see how we can send interactions in the UIKit MapView to our SwiftUI code to trigger code in our ContentView.
Day 71
We write an extension to MKPointAnnotation to provide unwrapped versions of its title and subtitle properties. We also create an edit View so that the user can update this information. We then use URLSession to load data from Wikipedia for the location tapped on the map.
Day 72
Today we create methods to write our places to, and read them back from, a local file on the device. Paul also shows how you can add Codable conformance to someone else’s class. We also see how easy it is to encrypt the file by using the .completeFileProtection option when writing the data.
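A sketch of the write side, assuming we encode with JSONEncoder and use a placeholder file name:

```swift
import Foundation

// Write encoded data to the documents directory with file protection turned on.
func save<T: Encodable>(_ value: T, as filename: String) {
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    let url = documents.appendingPathComponent(filename)

    do {
        let data = try JSONEncoder().encode(value)
        try data.write(to: url, options: [.atomic, .completeFileProtection])
    } catch {
        print("Unable to save data: \(error.localizedDescription)")
    }
}

// Usage: save(places, as: "places.json")
```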
Day 74
Accessibility is the topic today. Paul introduces us to the SwiftUI way of making our views and data work better with systems like VoiceOver. We look at adding and removing .accessibility traits to make the narration of our view make more sense. Also how to group and hide elements that are not adding to the user experience.
Day 75
Today Paul takes us back to some of our earlier projects and shows how they fail in some way with regard to accessibility. We then look at the fixes needed to make the Apps useful. These fixes are so easy to add that it really does make you think about how straightforward it is to make our Apps accessible in SwiftUI.
Day 79
After a couple of review and challenge days, we get back into SwiftUI proper. Today Paul covers techniques involving @EnvironmentObject: how it differs from @StateObject, and how it allows child views to access an object even if the views in between have not accessed or passed the object forward. We also take a look at TabView, how to build tabs, and how to set the image and text on a .tabItem. Paul notes that TabView should always be the parent view and that NavigationViews should sit inside a tab.
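A minimal TabView sketch showing the NavigationView sitting inside the tab:

```swift
import SwiftUI

struct MainView: View {
    var body: some View {
        TabView {
            // The NavigationView lives inside the tab, not around the TabView.
            NavigationView {
                Text("First screen")
                    .navigationBarTitle("Home")
            }
            .tabItem {
                Image(systemName: "house")
                Text("Home")
            }

            Text("Second screen")
                .tabItem {
                    Image(systemName: "person.3")
                    Text("People")
                }
        }
    }
}
```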
Day 81
We look at creating context menus and how to schedule a local notification. Paul also introduces us to adding a Swift Package to Xcode.
Day 82
We start to build our next App ‘Hot Prospects’ today. We start by building out our interface using a TabView. We create a model using a class which conforms to ObservableObject. We then provide this model to our views by adding @EnvironmentObject to our ContentView. We then use a filter() to show a filtered selection from our array.
Day 83
We work on the MeView today and generate a QR code which encodes a name and email address. We meet .textContentType for the TextField view, which helps the user by providing the appropriate keyboard and autofill hints. We use the Swift Package Manager to download a package Paul has written to scan QR codes, and use this to read in codes and add people to our array.
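Generating the QR code itself follows this Core Image pattern (the fallback image is my own choice):

```swift
import UIKit
import CoreImage.CIFilterBuiltins

// Turn a string into a QR code image using the built-in Core Image filter.
func generateQRCode(from string: String) -> UIImage {
    let context = CIContext()
    let filter = CIFilter.qrCodeGenerator()
    filter.message = Data(string.utf8)

    if let outputImage = filter.outputImage,
       let cgImage = context.createCGImage(outputImage, from: outputImage.extent) {
        return UIImage(cgImage: cgImage)
    }

    // Fall back to an SF Symbol if generation fails.
    return UIImage(systemName: "xmark.circle") ?? UIImage()
}
```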
Day 84
We save our prospects to UserDefaults, and Paul shows us a method of avoiding the typos that the stringly typed UserDefaults system invites, by using a static property for the key. We also look at the use of access control by marking properties and methods private or fileprivate. This ensures that our code must use the methods inside the model and helps avoid bugs. We then create a local notification using UNUserNotificationCenter.
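Scheduling a local notification looks roughly like this (the content and the five-second test trigger are placeholders):

```swift
import Foundation
import UserNotifications

// Request permission and schedule a simple local notification.
func scheduleReminder() {
    let center = UNUserNotificationCenter.current()

    center.requestAuthorization(options: [.alert, .badge, .sound]) { granted, _ in
        guard granted else { return }

        let content = UNMutableNotificationContent()
        content.title = "Contact reminder"
        content.body = "Don't forget to follow up."
        content.sound = .default

        // Fire five seconds from now, just for testing.
        let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5, repeats: false)
        let request = UNNotificationRequest(identifier: UUID().uuidString,
                                            content: content,
                                            trigger: trigger)
        center.add(request)
    }
}
```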
Day 86
We explore gestures in SwiftUI. We are introduced to gestures like DragGesture and LongPressGesture. We learn that gestures have a hierarchy and that they can clash if two views have different gestures assigned to them, but also how flexible they can be and how they can be combined to make interesting effects.
We then explore haptic feedback and see that we can create our own by importing CoreHaptics. Then we look at user interactivity and how we can use the allowsHitTesting() and contentShape() modifiers to change which parts of a view respond to taps.
Day 87
Today we look at Timers and how to .publish and .cancel() them. Paul introduces us to the Notification Center and how we can be notified of events like when our application enters the background using UIApplication.willResignActiveNotification. Finally, for today we look at how the @Environment object can provide us with knowledge of the user’s accessibility settings. This will allow us to make changes in our code to support options like .accessibilityDifferentiateWithoutColor, .accessibilityReduceMotion and .accessibilityReduceTransparency.
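A sketch of a published timer driving a view via onReceive:

```swift
import SwiftUI

struct CountdownView: View {
    @State private var timeRemaining = 100
    // An auto-connected publisher that fires every second on the main run loop.
    let timer = Timer.publish(every: 1, on: .main, in: .common).autoconnect()

    var body: some View {
        Text("Time: \(timeRemaining)")
            .onReceive(timer) { _ in
                if timeRemaining > 0 {
                    timeRemaining -= 1
                } else {
                    // Stop the timer once we hit zero.
                    timer.upstream.connect().cancel()
                }
            }
    }
}
```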
Day 88
We create a SwiftUI view to represent our CardView and learn how to tell Xcode to run our App only in landscape mode, using the project settings. We then build a stack of cards using offsets. We use DragGesture() to manipulate our stack of cards using the .rotationEffect and .offset modifiers. With these small code elements we build a UI that feels very responsive to our fingers. We also add a withAnimation block to animate the removal of a card from the stack.
Day 89
Today we remind ourselves about accessibility and how our App can alter the visual feedback it provides to users who have .accessibilityDifferentiateWithoutColor set. We use a Timer and take action if our App is no longer the foreground App, using NotificationCenter’s UIApplication.willResignActiveNotification and .willEnterForegroundNotification to pause our timer. We also explore how we can use the allowsHitTesting() modifier to halt the game when the timer runs out.
Day 90
Today we add haptics to our App, to provide physical feedback for our drag gesture. We find that it is good practice to prepare the haptic engine by issuing a prepare() command before actually playing haptic feedback. This ensures the haptics fire when you want and don’t get delayed, which would give feedback at the wrong time.
Day 92
Paul explains how layout and geometry work in SwiftUI: the use of alignment and alignment guides, and how we can make our own custom alignment guide.
Day 93
We explore absolute positioning in SwiftUI. We then take a deeper look at GeometryReader and how frames and coordinates behave inside one. Paul touches on .coordinateSpace today, then shows how interesting effects can be obtained by reading values from a GeometryReader inside a ScrollView.