
Creating discharge points from a CSV file

If you’ve got a set of deposition points in a CSV file and want to import them and create separate files from each (so you can pour tailings from individual points in single or multi-stream deposition), there are a few options.

One would be to import the points one by one using Data import/Text data/Paste points from clipboard, manually pasting each point’s coordinates into the command and then saving the layer.  Another would be to use Polyline/Create/Draw a line, manually enter the point coordinates, then save it.  Both of these are tedious.

Instead, below we’ve created a script that reads data from a CSV/text file, creates a new layer for each point, and then saves them all.

import csv
from muk3d.geometry.polyline import PolyLine
from muk3d.file import save_geometry

# the CSV file to load.  Expecting to find just xyz coordinates.
csv_filename = 'points.csv'

with open(csv_filename) as csvfile:
    # work out what the delimiter for the CSV file is.
    # It could be spaces, tabs, or commas, but this
    # tries to make an educated guess.
    dialect = csv.Sniffer().sniff(csvfile.read(1024))

    # go back to the start of the CSV file, otherwise the CSV will be read from the 1024th character.
    csvfile.seek(0)

    # create the CSV reader
    csvreader = csv.reader(csvfile, dialect)

    # loop through each row in the CSV.  The enumerate function will return the row number,
    # in addition to the row data.  Row numbering starts at 0.
    for i, row in enumerate(csvreader):

        # convert the values in the row to floating point numbers
        x, y, z = map(float, row)

        # create a polyline with just a single point.  We create a PolyLine instead of Points so
        # that we can use these layers in a deposition model.
        pl = PolyLine.From_Points([[x, y, z]])

        # want point numbering to start at 1
        point_number = i + 1

        # save the point to an mcurve file
        save_geometry(pl, 'discharge-{:03d}.mcurve'.format(point_number))

The explanation

This code relies on 3 imports:

  • the csv module is part of the Python standard library and simplifies the process of importing CSV files, or text files with comma/space/tab delimited points.
  • PolyLine is a class from the Muk3D API that allows for the creation of PolyLine objects.
  • save_geometry is a function that allows for the saving of muk3d.geometry objects.

Line 6: In this instance, the filename is hard-coded in the script.  This could be replaced with a dialog box that prompts the user for the name of the CSV/text file to load.

Line 8: The CSV file is opened for reading and the file handle is assigned to the variable csvfile. Because it’s opened with the with statement, the file is closed automatically when the block exits, so there is no need to close it explicitly, which would be necessary if we opened the file using the syntax below.

file = open(csv_filename)
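To illustrate the difference, here is a minimal sketch using a temporary file as a stand-in for points.csv (the file name and content here are made up for the example):

```python
import os
import tempfile

# write a small demo file (a stand-in for points.csv)
path = os.path.join(tempfile.gettempdir(), 'demo_points.csv')
with open(path, 'w') as f:
    f.write('1.0,2.0,3.0\n')

# with-statement: the file is closed automatically when the block exits
with open(path) as csvfile:
    data = csvfile.read()
print(csvfile.closed)   # True

# manual version: we must remember to call close() ourselves
csvfile = open(path)
data = csvfile.read()
csvfile.close()
print(csvfile.closed)   # True

os.remove(path)
```

The with form is preferred because the file is closed even if an error occurs partway through reading.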

Line 12: To make this script versatile, we don’t want to hardwire the delimiter used in the file being opened. The csv module has a class called Sniffer that attempts to determine useful information about the CSV file that helps parse it, including whether it has a header row and what the delimiter is.  In this case, it’s being fed the first 1024 characters of the CSV file.  The number is somewhat arbitrary, but it should be long enough to cover at least the first few rows of the text file.

dialect = csv.Sniffer().sniff(csvfile.read(1024))

The metadata stored in the dialect variable can then be passed as an argument when we create the CSV reader.
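As a quick sketch of the Sniffer in action, here it is detecting three different delimiters from made-up sample strings:

```python
import csv

samples = {
    'comma': '1.0,2.0,3.0\n4.0,5.0,6.0\n',
    'tab':   '1.0\t2.0\t3.0\n4.0\t5.0\t6.0\n',
    'space': '1.0 2.0 3.0\n4.0 5.0 6.0\n',
}

for name, text in samples.items():
    dialect = csv.Sniffer().sniff(text)
    print(name, repr(dialect.delimiter))
```

Each call returns a dialect object whose delimiter attribute holds the detected separator for that sample.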

Line 15: When data is read from a file piece-wise, as in line 12 with csvfile.read(1024), any further reading would start at position 1024, so the first 1024 characters wouldn’t be seen by the CSV reader.  Think of it as a marker that tracks what has been read so far; as more data is requested, reading continues from where it left off.  File handles (the variable we get after calling open(filename)) have a method called seek that moves the current reading position in a file to a specific location (for a text file, it’s the character index).  Calling csvfile.seek(0) moves this position back to the first character of the file, so when the CSV starts to be read, it starts from the beginning.
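This behaviour can be sketched with io.StringIO, an in-memory stand-in for a text file:

```python
import io

f = io.StringIO('abcdef')

first = f.read(3)    # reading advances the position
rest = f.read()      # continues from where the last read stopped
print(first, rest)   # abc def

f.seek(0)            # move the read position back to the start
everything = f.read()
print(everything)    # abcdef
```

Without the seek(0), the second full read would return an empty string because the position is already at the end of the data.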

Line 18: Create the CSV reader object. This takes the csv file handle created in line 8, as well as the metadata created by the Sniffer.

Line 22: Here we start iterating through each row in the CSV file.  Note that csv.reader does not skip header rows automatically; the script expects the file to contain only coordinates, so a header row would cause the float conversion on line 25 to fail.  If headers are possible, the Sniffer’s has_header method can be used to detect one so the first row can be skipped.

for i, row in enumerate(csvreader): 

The expression enumerate(csvreader) is a way of returning not just each row of data, but also a value indicating the row number (starting at 0 for the first row).
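A minimal sketch of enumerate with row-like data (the rows here are made up):

```python
rows = [['1.0', '2.0', '3.0'], ['4.0', '5.0', '6.0']]

for i, row in enumerate(rows):
    print(i, row)
# 0 ['1.0', '2.0', '3.0']
# 1 ['4.0', '5.0', '6.0']

# enumerate also accepts a start argument, which would be another way
# of getting point numbering to begin at 1
numbered = list(enumerate(rows, start=1))
print(numbered[0][0])   # 1
```

The script adds 1 to the index manually; enumerate(csvreader, start=1) would achieve the same thing.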

Line 25: The row variable is a list of values (split on the detected delimiter) from the current row in the CSV file. Each value in the list will be a string, and it’s up to the user to convert the values into the appropriate data types. In this case we want the coordinates of each point, so we turn the three string values into floating point numbers using float(value). Rather than explicitly doing this for each element in the list, we can use the map function. This function takes a function and an iterable and applies the function to each value. In this case, it takes each value and passes it as an argument to the float function, turning the string into a floating point number.

Since there are 3 values in each row, we unpack the values into x, y, and z variables.
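For example, with a made-up row of strings:

```python
row = ['1.5', '2.5', '10.0']

# map applies float to each string in the list, and the three
# results are unpacked into x, y, and z
x, y, z = map(float, row)
print(x, y, z)   # 1.5 2.5 10.0

# equivalent explicit version, using a list comprehension
x2, y2, z2 = [float(value) for value in row]
print((x, y, z) == (x2, y2, z2))   # True
```

If a row contains more or fewer than three values, the unpacking raises a ValueError, which is a useful early warning that the input file isn’t plain xyz data.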

Line 29: For each point, we want to create a separate polyline object. To do this we can use PolyLine.From_Points. This method expects a list of 3D points. Since we only have a single point, we create a list containing just that point and pass it to the method, resulting in a new PolyLine that comprises just this point.

Line 32: We create a variable that represents the point number (with first point being point 1) by adding 1 to the row enumeration value.

Line 35: The PolyLine object is saved using the appropriate point number.
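The {:03d} format specifier pads the point number with leading zeros to three digits, which keeps the output files sorting in numeric order:

```python
for point_number in (1, 12, 123):
    print('discharge-{:03d}.mcurve'.format(point_number))
# discharge-001.mcurve
# discharge-012.mcurve
# discharge-123.mcurve
```

With more than 999 points the number simply grows to four digits rather than being truncated.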

Once the script has run, there will be a saved polyline file for each point in the CSV file.
