segmentation fault on large files

Andrew England Member Posts: 2

I have code that adds new lines in data files, but it hits a segmentation fault when the file is very large (hundreds of thousands of lines). Do I just need to increase MAXLINE, or should I use malloc() for some of the variables here? If so, which variables, and where? Any advice would be greatly appreciated! (By the way, I am fairly new to C, so I am just learning.)

#include <string.h>
#include <math.h>
#include <stdio.h>
#define MAXLINE 200

int main()
{
    printf("enter file name: ");
    char filename[MAXLINE];
    scanf("%s", filename);

    FILE *newfile = fopen(filename, "r");
    FILE *tempfile = fopen("tempfile", "w");

    /* first pass: count the lines */
    int ch, nlines = 0;
    while ((ch = fgetc(newfile)) != EOF)
        if (ch == '\n')
            nlines++;
    rewind(newfile);

    float doub[nlines];
    char line[MAXLINE], rest[nlines][MAXLINE];

    /* second pass: copy each line to the temp file, inserting a
       blank line whenever the leading number increases */
    for (int i = 0; i < nlines; i++) {
        fgets(line, MAXLINE, newfile);
        sscanf(line, "%f %s", &doub[i], rest[i]);

        if (i > 0 && doub[i] > doub[i-1])
            fputc('\n', tempfile);
        fputs(line, tempfile);
    }

    return 0;
}


