Swift Camera App: An iOS Tutorial for Beginners

by Jhon Lennon

Hey guys! Ever wanted to build your own camera app using Swift for iOS? Well, you're in the right place! This tutorial will guide you through creating a basic camera application from scratch. We'll cover everything from setting up the UI to handling camera permissions and capturing photos. By the end of this guide, you’ll have a working camera app and a solid understanding of how to work with the camera in Swift.

Prerequisites

Before we dive in, make sure you have the following:

  • Xcode: You'll need Xcode, the IDE for developing iOS applications. You can download it from the Mac App Store.
  • Swift Knowledge: A basic understanding of Swift programming is essential.
  • iOS Device or Simulator: You can test your app on a physical iOS device or on the iOS Simulator that comes with Xcode. Keep in mind that the Simulator doesn't provide a camera, so you'll need a real device to try out the camera features.

Step 1: Creating a New Xcode Project

First, let's create a new Xcode project. Open Xcode and follow these steps:

  1. Click on "Create a new Xcode project."
  2. Choose "iOS," select the "App" template, and click "Next."
  3. Enter your project details:
    • Product Name: CameraApp (or whatever you prefer)
    • Organization Identifier: com.example (or your own identifier)
    • Interface: Storyboard
    • Language: Swift
  4. Click "Next" and choose a location to save your project. Click "Create."

Now you have a brand new Xcode project ready for coding!

Step 2: Designing the User Interface

Let's design the UI for our camera app. We'll need a UIImageView to display the camera preview and a UIButton to capture photos. Here's how to set it up in the Main.storyboard:

  1. Open Main.storyboard.
  2. Drag a UIImageView from the Object Library to the view. Resize it to fill most of the screen. This will display the camera feed.
  3. Add a UIButton at the bottom of the screen. Set its title to "Capture." This will trigger the photo capture.
  4. Add constraints to the UIImageView and UIButton to ensure they are properly positioned on different screen sizes. You can do this by using the Auto Layout features in Xcode.
  5. Create outlets and actions in your ViewController.swift file. Connect the UIImageView to an outlet named imageView and the UIButton to an action named captureButtonTapped.

Here’s the code you need to add to your ViewController.swift:

import UIKit

class ViewController: UIViewController {

    @IBOutlet weak var imageView: UIImageView!
    
    override func viewDidLoad() {
        super.viewDidLoad()
        // Additional setup will go here
    }
    
    @IBAction func captureButtonTapped(_ sender: UIButton) {
        // Capture button action will go here
    }
}

Step 3: Setting Up the Camera Session

Now comes the fun part: setting up the camera session! We'll use the AVFoundation framework to access the camera. Add the following code to your ViewController.swift:

import UIKit
import AVFoundation

class ViewController: UIViewController {

    @IBOutlet weak var imageView: UIImageView!
    
    var captureSession: AVCaptureSession!
    var stillImageOutput: AVCapturePhotoOutput!
    var videoPreviewLayer: AVCaptureVideoPreviewLayer!

    override func viewDidLoad() {
        super.viewDidLoad()

        captureSession = AVCaptureSession()
        captureSession.sessionPreset = .medium

        guard let backCamera = AVCaptureDevice.default(for: AVMediaType.video) else {
            print("Unable to access back camera!")
            return
        }

        do {
            let input = try AVCaptureDeviceInput(device: backCamera)
            stillImageOutput = AVCapturePhotoOutput()

            if captureSession.canAddInput(input) && captureSession.canAddOutput(stillImageOutput) {
                captureSession.addInput(input)
                captureSession.addOutput(stillImageOutput)
                setupLivePreview()
            }
        }
        catch let error {
            print("Error: unable to initialize back camera: \(error.localizedDescription)")
        }
    }
    
    func setupLivePreview() {
        videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        videoPreviewLayer.videoGravity = .resizeAspectFill
        videoPreviewLayer.connection?.videoOrientation = .portrait
        imageView.layer.addSublayer(videoPreviewLayer)

        DispatchQueue.global(qos: .userInitiated).async {
            self.captureSession.startRunning()
            DispatchQueue.main.async {
                self.videoPreviewLayer.frame = self.imageView.bounds
            }
        }
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        navigationController?.setNavigationBarHidden(true, animated: animated)
    }
        
    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        navigationController?.setNavigationBarHidden(false, animated: animated)
    }

    @IBAction func captureButtonTapped(_ sender: UIButton) {
        // Capture button action will go here
    }
}

Here’s what this code does:

  • Imports AVFoundation: This line imports the necessary framework for working with the camera.
  • Declares Variables: We declare variables for the capture session, image output, and video preview layer.
  • Sets Up Capture Session: In viewDidLoad(), we create an AVCaptureSession, set its preset, and get the back camera.
  • Adds Input and Output: We add the camera as an input and the image output to the capture session.
  • Sets Up Live Preview: The setupLivePreview() function creates an AVCaptureVideoPreviewLayer to display the camera feed in the UIImageView. It also starts the capture session on a background queue to avoid blocking the main thread. See the sketch below for keeping the preview layer's frame in sync when the layout changes.
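
One detail the code above doesn't handle is layout changes: the preview layer's frame is set only once, right after the session starts, so it can end up misaligned after rotation or when the view resizes. A minimal sketch of one way to keep it in sync (our addition, not part of the original steps) is to update the frame in viewDidLayoutSubviews:

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // Keep the preview layer sized to the image view whenever the layout changes
    videoPreviewLayer?.frame = imageView.bounds
}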

Step 4: Handling Camera Permissions

To access the camera, you need to request permission from the user. Add the following code to your ViewController.swift:

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)

    // Request camera authorization
    AVCaptureDevice.requestAccess(for: .video) { granted in
        if granted {
            // Permission granted; add the preview only if it isn't already on screen
            DispatchQueue.main.async {
                if self.videoPreviewLayer == nil {
                    self.setupLivePreview()
                }
            }
        } else {
            // Permission denied; in a real app, show an alert directing the user to Settings
            print("Camera permission denied")
        }
    }
}

Also, you need to add a key to your Info.plist file to explain why you need camera access. Add the Privacy - Camera Usage Description key (NSCameraUsageDescription) and provide a description, such as "This app needs access to the camera to take photos."
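
If you'd rather check the current permission state before prompting (for example, to show your own explanation when access was previously denied), AVCaptureDevice also exposes authorizationStatus(for:). Here's a minimal sketch; the checkCameraAuthorization name is our own, and you could call it from viewDidAppear in place of the plain requestAccess call above:

func checkCameraAuthorization() {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        // Already allowed; nothing more to do
        break
    case .notDetermined:
        // First launch: this triggers the system permission prompt
        AVCaptureDevice.requestAccess(for: .video) { granted in
            print(granted ? "Camera access granted" : "Camera permission denied")
        }
    case .denied, .restricted:
        // Previously denied, or restricted by parental controls; direct the user to Settings
        print("Camera access denied or restricted")
    @unknown default:
        break
    }
}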

Step 5: Capturing Photos

Now, let's implement the captureButtonTapped action to capture photos. Add the following code to your ViewController.swift:

@IBAction func captureButtonTapped(_ sender: UIButton) {
    let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
    stillImageOutput.capturePhoto(with: settings, delegate: self)
}
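
Optionally, you can ask the photo output which codecs the device supports and prefer HEVC when it's available, falling back to JPEG otherwise. This is a sketch of an optional refinement, not something the rest of the tutorial depends on:

@IBAction func captureButtonTapped(_ sender: UIButton) {
    // Prefer HEVC on devices that support it; otherwise fall back to JPEG
    let settings: AVCapturePhotoSettings
    if stillImageOutput.availablePhotoCodecTypes.contains(.hevc) {
        settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.hevc])
    } else {
        settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
    }
    stillImageOutput.capturePhoto(with: settings, delegate: self)
}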

We also need to conform to the AVCapturePhotoCaptureDelegate protocol. Add the following extension to your ViewController.swift:

extension ViewController: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        if let error = error {
            print("Error capturing photo: \(error.localizedDescription)")
            return
        }

        guard let imageData = photo.fileDataRepresentation(),
              let image = UIImage(data: imageData) else {
            return
        }

        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
        print("Photo saved successfully!")
    }
}

This code captures a photo when the button is tapped and saves it to the user's photo library. Make sure to add the Privacy - Photo Library Additions Usage Description key (NSPhotoLibraryAddUsageDescription) to your Info.plist.
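
As a side note, UIImageWriteToSavedPhotosAlbum can report back whether the save actually succeeded. The delegate above passes nil for the completion target, but a small sketch of the standard target/selector pattern looks like this (the method signature is the one UIKit expects for the completion callback):

// In the delegate, pass self and a selector instead of nil:
// UIImageWriteToSavedPhotosAlbum(image, self, #selector(image(_:didFinishSavingWithError:contextInfo:)), nil)

@objc func image(_ image: UIImage, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {
    if let error = error {
        print("Could not save photo: \(error.localizedDescription)")
    } else {
        print("Photo saved to the library")
    }
}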

Step 6: Running Your App

Now you can run your app. Keep in mind that the iOS Simulator doesn't provide a camera, so you'll need a physical iOS device to see the live preview and capture photos. Make sure to allow camera permissions when prompted. Tap the "Capture" button to take a photo, and it will be saved to your photo library.
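
One small, optional refinement that isn't part of the steps above: the capture session keeps running until you stop it, so you may want to stop it when the screen goes away, mirroring how it was started on a background queue. A sketch, extending the existing viewWillDisappear:

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    navigationController?.setNavigationBarHidden(false, animated: animated)

    // startRunning() and stopRunning() are blocking calls, so keep them off the main thread
    DispatchQueue.global(qos: .userInitiated).async {
        self.captureSession.stopRunning()
    }
}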

Complete Code

Here’s the complete ViewController.swift code:

import UIKit
import AVFoundation

class ViewController: UIViewController {

    @IBOutlet weak var imageView: UIImageView!
    
    var captureSession: AVCaptureSession!
    var stillImageOutput: AVCapturePhotoOutput!
    var videoPreviewLayer: AVCaptureVideoPreviewLayer!

    override func viewDidLoad() {
        super.viewDidLoad()

        captureSession = AVCaptureSession()
        captureSession.sessionPreset = .medium

        guard let backCamera = AVCaptureDevice.default(for: AVMediaType.video) else {
            print("Unable to access back camera!")
            return
        }

        do {
            let input = try AVCaptureDeviceInput(device: backCamera)
            stillImageOutput = AVCapturePhotoOutput()

            if captureSession.canAddInput(input) && captureSession.canAddOutput(stillImageOutput) {
                captureSession.addInput(input)
                captureSession.addOutput(stillImageOutput)
                setupLivePreview()
            }
        }
        catch let error {
            print("Error: unable to initialize back camera: \(error.localizedDescription)")
        }
    }
    
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        // Request camera authorization
        AVCaptureDevice.requestAccess(for: .video) { granted in
            if granted {
                // Permission granted; add the preview only if it isn't already on screen
                DispatchQueue.main.async {
                    if self.videoPreviewLayer == nil {
                        self.setupLivePreview()
                    }
                }
            } else {
                // Permission denied; in a real app, show an alert directing the user to Settings
                print("Camera permission denied")
            }
        }
    }
    
    func setupLivePreview() {
        videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        videoPreviewLayer.videoGravity = .resizeAspectFill
        videoPreviewLayer.connection?.videoOrientation = .portrait
        imageView.layer.addSublayer(videoPreviewLayer)

        DispatchQueue.global(qos: .userInitiated).async {
            self.captureSession.startRunning()
            DispatchQueue.main.async {
                self.videoPreviewLayer.frame = self.imageView.bounds
            }
        }
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        navigationController?.setNavigationBarHidden(true, animated: animated)
    }
        
    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        navigationController?.setNavigationBarHidden(false, animated: animated)
    }

    @IBAction func captureButtonTapped(_ sender: UIButton) {
        let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        stillImageOutput.capturePhoto(with: settings, delegate: self)
    }
}

extension ViewController: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        if let error = error {
            print("Error capturing photo: \(error.localizedDescription)")
            return
        }

        guard let imageData = photo.fileDataRepresentation(),
              let image = UIImage(data: imageData) else {
            return
        }

        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
        print("Photo saved successfully!")
    }
}

Conclusion

And there you have it! You've successfully created a basic camera app using Swift. This tutorial covered setting up the UI, handling camera permissions, capturing photos, and saving them to the photo library. This is just the beginning. You can extend this app by adding features like filters, zoom, and more. Happy coding, and keep building awesome iOS apps! Remember to always test your app thoroughly on different devices and iOS versions to ensure the best user experience. Keep experimenting and expanding on this base to create something truly unique!