Read File in Golang with Timeout (Handle Large Files, Scanner Errors & Real Examples)

Read File in Golang (Quick Cheat Sheet)

Common Methods to Read Files in Go

Reading files in Go can be done using multiple approaches depending on your use case, file size, and performance requirements.

I have already covered the various ways to read a file in Go in a separate article: Different methods to read file in GO.

go
// Read entire file (simple)
data, err := os.ReadFile("file.txt")

// Read line by line (scanner)
scanner := bufio.NewScanner(file)

// Read using reader (more control)
reader := bufio.NewReader(file)

// Read in chunks (large files)
buf := make([]byte, 1024)
n, _ := file.Read(buf)

When to Use Each Method (Scanner vs Reader vs ReadFile)

| Method | Best Use Case | Pros | Cons |
|---|---|---|---|
| os.ReadFile | Small files | Simple, one-liner | High memory usage |
| bufio.Scanner | Line-by-line reading | Easy, clean API | Fails on long lines (token limit) |
| bufio.Reader | Large/controlled reads | Flexible, efficient | More code |
| file.Read() (chunks) | Very large files | Low memory usage | Manual handling |

Quick Recommendations:

  • Use os.ReadFile → when file size is small (<10MB)
  • Use bufio.Scanner → when processing line-by-line logs/configs
  • Use bufio.Reader → when you need more control over reading
  • Use chunk-based reading → for large files (GB size)
HINT
If you hit an error like bufio.Scanner: token too long, switch to bufio.Reader instead.

Read File in Golang with Timeout

Read Entire File with Timeout (os.ReadFile + select)

To read a file with timeout in Go, run the file read operation inside a goroutine and use select with time.After to control how long the operation can wait.

go
package main

import (
    "fmt"
    "os"
    "time"
)

func main() {
    dataChannel := make(chan []byte, 1)

    go func() {
        data, err := os.ReadFile("file.txt")
        if err != nil {
            fmt.Println("error:", err)
            return
        }
        dataChannel <- data
    }()

    select {
    case res := <-dataChannel:
        fmt.Println(string(res))
    case <-time.After(2 * time.Second):
        fmt.Println("timeout")
    }
}

Explanation:

  • os.ReadFile reads the complete file content into memory.
  • A goroutine runs the file read separately from the main flow.
  • dataChannel sends the file content back to the main function.
  • select waits for either the file content or timeout.
  • time.After triggers timeout if reading takes more than 2 seconds.

This method is best for small files. If you want a separate beginner example, see read file into variable in Go.

Read File Line by Line with Timeout (bufio.Scanner)

For large files, reading line by line is more memory-efficient than loading the entire file at once. bufio.Scanner is useful for logs, config files, and simple text processing.

go
package main

import (
    "bufio"
    "fmt"
    "os"
    "time"
)

func main() {
    file, err := os.Open("file.txt")
    if err != nil {
        fmt.Println("error:", err)
        return
    }
    defer file.Close()

    scanner := bufio.NewScanner(file)
    timeout := time.After(2 * time.Second)

    for scanner.Scan() {
        select {
        case <-timeout:
            fmt.Println("timeout")
            return
        default:
            fmt.Println(scanner.Text())
            time.Sleep(500 * time.Millisecond)
        }
    }

    if err := scanner.Err(); err != nil {
        fmt.Println("scanner error:", err)
    }
}

Explanation:

  • os.Open opens the file without loading everything into memory.
  • bufio.Scanner reads one line at a time.
  • timeout := time.After(...) creates one timeout for the full read operation.
  • select checks, before each line is processed, whether the timeout has already expired; a Scan call that is already blocking is not interrupted.
  • time.Sleep is only used here to simulate slow processing.

This works well with goroutines in Go when file processing is part of a larger concurrent workflow.

Read File with Timeout using bufio.Reader (More Control)

bufio.Reader gives more control than bufio.Scanner. It is useful when lines may be long or when you want custom reading logic.

go
package main

import (
    "bufio"
    "fmt"
    "io"
    "os"
    "time"
)

func main() {
    file, err := os.Open("file.txt")
    if err != nil {
        fmt.Println("error:", err)
        return
    }
    defer file.Close()

    reader := bufio.NewReader(file)
    timeout := time.After(2 * time.Second)

    for {
        select {
        case <-timeout:
            fmt.Println("timeout")
            return
        default:
            line, err := reader.ReadString('\n')
            if err == io.EOF {
                if line != "" {
                    fmt.Print(line)
                }
                return
            }
            if err != nil {
                fmt.Println("error:", err)
                return
            }
            fmt.Print(line)
            time.Sleep(500 * time.Millisecond)
        }
    }
}

Explanation:

  • bufio.Reader reads buffered data from the file.
  • ReadString('\n') reads until a newline character.
  • Unlike Scanner, Reader has no maximum token size, so long lines are handled safely.
  • Timeout prevents the read loop from running longer than expected.

If you need to convert Reader output into a string, see convert io.Reader to string in Go.


Handle Large Files with Timeout Efficiently

Read File in Chunks with Timeout

Chunk-based reading is useful for very large files because it reads fixed-size blocks instead of loading the entire file into memory.

go
package main

import (
    "fmt"
    "io"
    "os"
    "time"
)

func main() {
    file, err := os.Open("file.txt")
    if err != nil {
        fmt.Println("error:", err)
        return
    }
    defer file.Close()

    buf := make([]byte, 1024)
    timeout := time.After(2 * time.Second)

    for {
        select {
        case <-timeout:
            fmt.Println("timeout")
            return
        default:
            n, err := file.Read(buf)
            if err == io.EOF {
                return
            }
            if err != nil {
                fmt.Println("error:", err)
                return
            }
            fmt.Print(string(buf[:n]))
            time.Sleep(500 * time.Millisecond)
        }
    }
}

Explanation:

  • buf := make([]byte, 1024) creates a 1 KB buffer.
  • file.Read(buf) reads only one chunk at a time.
  • string(buf[:n]) prints only the bytes actually read.
  • This avoids high memory usage for large files.
  • Timeout keeps the operation controlled.

Avoid High Memory Usage while Reading Files

When reading large files, avoid os.ReadFile because it loads the full file into memory. Instead, use Scanner, Reader, or chunk-based reading.

go
file, err := os.Open("large.log")
if err != nil {
    return
}
defer file.Close()

buf := make([]byte, 4096)
for {
    n, err := file.Read(buf)
    if n > 0 {
        fmt.Print(string(buf[:n]))
    }
    if err != nil {
        break
    }
}

Explanation:

  • The file is opened as a stream.
  • Only 4096 bytes are read at a time.
  • Memory usage remains stable even for large files.
  • This is better for logs, backups, exports, and large data files.

For more patterns around communication between file-reading goroutines, see Go channels.

Stream File Processing in Go with Timeout

Streaming means processing data while reading it, instead of waiting for the full file to load.

go
package main

import (
    "bufio"
    "fmt"
    "os"
    "time"
)

func main() {
    file, err := os.Open("app.log")
    if err != nil {
        fmt.Println("error:", err)
        return
    }
    defer file.Close()

    scanner := bufio.NewScanner(file)
    timeout := time.After(3 * time.Second)

    for scanner.Scan() {
        select {
        case <-timeout:
            fmt.Println("timeout")
            return
        default:
            line := scanner.Text()
            fmt.Println("processing:", line)
        }
    }
}

Explanation:

  • bufio.Scanner reads one line at a time.
  • Each line is processed immediately.
  • The program does not wait for the full file.
  • Timeout avoids uncontrolled long-running processing.

This pattern is common in log processors, monitoring tools, and ingestion pipelines. For broader design, see Go concurrency concepts.


Common Errors and Fixes

Fix "bufio.Scanner: token too long" Error

This error occurs when a line in the file exceeds bufio.Scanner's default maximum token size (64 KB).

go
scanner := bufio.NewScanner(file)

for scanner.Scan() {
    fmt.Println(scanner.Text())
}

if err := scanner.Err(); err != nil {
    fmt.Println("error:", err)
}

Error:

text
bufio.Scanner: token too long

Why this happens:

  • bufio.Scanner has a default max token size of 64 KB
  • If a single line exceeds this limit → Scanner fails

Solution 1: Increase Scanner Buffer Size

go
scanner := bufio.NewScanner(file)

buf := make([]byte, 0, 1024*1024) // 1 MB buffer
scanner.Buffer(buf, 1024*1024)

for scanner.Scan() {
    fmt.Println(scanner.Text())
}

Explanation:

  • scanner.Buffer() increases maximum token size
  • Useful for moderately large lines

Solution 2 (Recommended): Use bufio.Reader instead

go
reader := bufio.NewReader(file)

for {
    line, err := reader.ReadString('\n')
    if line != "" {
        fmt.Print(line) // line already ends with '\n'
    }
    if err != nil {
        break // io.EOF ends the loop after the final line is printed
    }
}

Why this works better:

  • bufio.Reader does NOT have token size limitations
  • More reliable for large files or long lines

When to use which:

  • Small/normal text → Scanner
  • Large lines / logs / JSON → Reader

Scanner vs Reader: Which One Should You Use?

Choosing between bufio.Scanner and bufio.Reader depends on your use case.

| Feature | bufio.Scanner | bufio.Reader |
|---|---|---|
| Ease of use | Very easy | Moderate |
| Performance | Good | Better |
| Large lines | ❌ Fails | ✅ Works |
| Custom parsing | Limited | Flexible |

Example: Scanner (simple and clean)

go
scanner := bufio.NewScanner(file)
for scanner.Scan() {
    fmt.Println(scanner.Text())
}

✔ Best for:

  • Config files
  • Logs with small lines

Example: Reader (more control)

go
reader := bufio.NewReader(file)

for {
    line, err := reader.ReadString('\n')
    if line != "" {
        fmt.Print(line)
    }
    if err != nil {
        break // prints the final line even without a trailing newline
    }
}

✔ Best for:

  • Large files
  • Long lines
  • Streaming data

File Read Timeout Not Working (Debug Guide)

Sometimes timeout logic does not behave as expected when reading files.

Problem 1: Timeout inside loop keeps resetting

go
for {
    select {
    case <-time.After(2 * time.Second):
        fmt.Println("timeout")
        return
    default:
        // read file
    }
}

Why this fails:

  • time.After() creates a fresh timer on every loop iteration
  • The 2-second countdown restarts each time, so the timeout case is never selected

Solution: Define timeout once outside loop

go
timeout := time.After(2 * time.Second)

for {
    select {
    case <-timeout:
        fmt.Println("timeout")
        return
    default:
        // read file
    }
}

Problem 2: Blocking file read ignores timeout

go
data, _ := os.ReadFile("file.txt") // blocking call

Why this fails:

  • os.ReadFile is blocking
  • Timeout cannot interrupt it directly

Solution: Use goroutine + select

go
dataChannel := make(chan []byte)

go func() {
    data, _ := os.ReadFile("file.txt")
    dataChannel <- data
}()

select {
case res := <-dataChannel:
    fmt.Println(string(res))
case <-time.After(2 * time.Second):
    fmt.Println("timeout")
}

Problem 3: Goroutine leak after timeout

If the timeout fires first, the reading goroutine keeps running; with an unbuffered channel it then blocks forever on the send, which is a goroutine leak.

Solution: Use context.WithTimeout (best practice)

go
ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
defer cancel()

done := make(chan bool, 1) // buffered so the goroutine can finish even after timeout

go func() {
    // simulate work
    time.Sleep(3 * time.Second)
    done <- true
}()

select {
case <-done:
    fmt.Println("completed")
case <-ctx.Done():
    fmt.Println("timeout:", ctx.Err())
}

Explanation:

  • context.WithTimeout cancels operation safely
  • Prevents resource leaks
  • Preferred in production systems

For deeper understanding, refer to Golang context usage.


Frequently Asked Questions

1. How to read a file with timeout in Golang?

To read a file with timeout in Golang, run the file operation inside a goroutine and use select with time.After or context.WithTimeout to limit execution time.

2. Why does bufio.Scanner give token too long error?

bufio.Scanner has a default 64KB token limit. If a line exceeds this size, it throws an error. You can fix it by increasing buffer size or switching to bufio.Reader.

3. What is the best way to read large files in Golang?

For large files, use bufio.Reader or read file in chunks instead of os.ReadFile to avoid high memory usage.

4. What is the difference between bufio.Scanner and bufio.Reader?

bufio.Scanner is easier to use for line-by-line reading but has size limitations, while bufio.Reader offers more control and works better for large data.

5. Why is the timeout not working in a Golang file read?

Timeout may not work if used incorrectly inside loops or with blocking calls. It should be implemented using goroutines with select or context.WithTimeout.

Summary

  • Reading files in Golang can be done using multiple methods such as os.ReadFile, bufio.Scanner, bufio.Reader, or chunk-based reading depending on file size and use case.
  • To implement timeout, file reading must be executed inside a goroutine and controlled using select with time.After or context.WithTimeout.
  • os.ReadFile is suitable for small files but not recommended for large files due to high memory usage.
  • bufio.Scanner is easy to use for line-by-line reading but has a default token size limitation (64 KB).
  • bufio.Reader provides more control and is better suited for large files or long lines.
  • Chunk-based reading is the most efficient method for handling very large files.
  • Timeout handling is essential to prevent blocking operations, especially when dealing with slow disks, network file systems, or large datasets.
  • Using context.WithTimeout is the recommended approach for production-grade applications to avoid goroutine leaks.

Deepak Prasad

R&D Engineer

Founder of GoLinuxCloud with over a decade of expertise in Linux, Python, Go, Laravel, DevOps, Kubernetes, Git, Shell scripting, OpenShift, AWS, Networking, and Security. With extensive experience, he excels across development, DevOps, networking, and security, delivering robust and efficient solutions for diverse projects.