Tables

BinTableHDU

class astropy.io.fits.BinTableHDU(data=None, header=None, name=None)

Bases: astropy.io.fits.hdu.table._TableBaseHDU

Binary table HDU class.

add_checksum(when=None, override_datasum=False, blocking='standard')

Add the CHECKSUM and DATASUM cards to this HDU with the values set to the checksum calculated for the HDU and the data respectively. The addition of the DATASUM card may be overridden.

Parameters :

when : str, optional

comment string for the cards; by default the comments will represent the time when the checksum was calculated

override_datasum : bool, optional

add the CHECKSUM card only

blocking : str, optional

“standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes at a time

Notes

For testing purposes, first call add_datasum with a when argument, then call add_checksum with a when argument and override_datasum set to True. This will provide consistent comments for both cards and enable the generation of a CHECKSUM card with a consistent value.

add_datasum(when=None, blocking='standard')

Add the DATASUM card to this HDU with the value set to the checksum calculated for the data.

Parameters :

when : str, optional

Comment string for the card that by default represents the time when the checksum was calculated

blocking : str, optional

“standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes at a time

Returns :

checksum : int

The calculated datasum

Notes

For testing purposes, provide a when argument to enable the comment value in the card to remain consistent. This will enable the generation of a CHECKSUM card with a consistent value.

copy()

Make a copy of the table HDU; both header and data are copied.

dump(datafile=None, cdfile=None, hfile=None, clobber=False)

Dump the table HDU to a file in ASCII format. The table may be dumped in three separate files, one containing column definitions, one containing header parameters, and one for table data.

Parameters :

datafile : file path, file object or file-like object, optional

Output data file. The default is the root name of the FITS file associated with this HDU, appended with the extension .txt.

cdfile : file path, file object or file-like object, optional

Output column definitions file. The default is None, in which case no column definitions output is produced.

hfile : file path, file object or file-like object, optional

Output header parameters file. The default is None, in which case no header parameters output is produced.

clobber : bool

Overwrite the output files if they exist.

Notes

The primary use for the dump method is to allow viewing and editing the table data and parameters in a standard text editor. The load method can be used to create a new table from the three plain text (ASCII) files.

  • datafile: Each line of the data file represents one row of table data. The data is output one column at a time in column order. If a column contains an array, each element of the column array in the current row is output before moving on to the next column. Each row ends with a new line.

    Integer data is output right-justified in a 21-character field followed by a blank. Floating point data is output right-justified using ‘g’ format in a 21-character field with 15 digits of precision, followed by a blank. String data that does not contain whitespace is output left-justified in a field whose width matches the width specified in the TFORM header parameter for the column, followed by a blank. When the string data contains whitespace characters, the string is enclosed in quotation marks (""). For the last data element in a row, the trailing blank in the field is replaced by a newline character.

    For column data containing variable length arrays (‘P’ format), the array data is preceded by the string 'VLA_Length= ' and the integer length of the array for that row, left-justified in a 21-character field, followed by a blank.

    For column data representing a bit field (‘X’ format), each bit value in the field is output right-justified in a 21-character field as 1 (for true) or 0 (for false).

  • cdfile: Each line of the column definitions file provides the definitions for one column in the table. The line is broken up into eight 16-character fields. The first field provides the column name (TTYPEn). The second field provides the column format (TFORMn). The third field provides the display format (TDISPn). The fourth field provides the physical units (TUNITn). The fifth field provides the dimensions for a multidimensional array (TDIMn). The sixth field provides the value that signifies an undefined value (TNULLn). The seventh field provides the scale factor (TSCALn). The eighth field provides the offset value (TZEROn). A field value of "" is used to represent the case where no value is provided.

  • hfile: Each line of the header parameters file provides the definition of a single HDU header card as represented by the card image.

filebytes()

Calculates and returns the number of bytes that this HDU will write to a file.

Parameters :

None

Returns :

The number of bytes.
fileinfo()

Returns a dictionary detailing information about the locations of this HDU within any associated file. The values are only valid after a read or write of the associated file with no intervening changes to the HDUList.

Parameters :

None

Returns :

dictionary or None

The dictionary details information about the locations of this HDU within an associated file. Returns None when the HDU is not associated with a file.

Dictionary contents:

Key       Value
file      File object associated with the HDU
filemode  Mode in which the file was opened (readonly, copyonwrite, update, append, ostream)
hdrLoc    Starting byte location of header in file
datLoc    Starting byte location of data block in file
datSpan   Data size including padding

classmethod fromstring(data, fileobj=None, offset=0, checksum=False, ignore_missing_end=False, **kwargs)

Creates a new HDU object of the appropriate type from a string containing the HDU’s entire header and, optionally, its data.

Note: When creating a new HDU from a string without a backing file object, the data of that HDU may be read-only. It depends on whether the underlying string was an immutable Python str/bytes object, or some kind of read-write memory buffer such as a memoryview.

Parameters :

data : str, bytearray, memoryview, ndarray

A byte string containing the HDU’s header and, optionally, its data. If fileobj is not specified, and the length of data extends beyond the header, then the trailing data is taken to be the HDU’s data. If fileobj is specified then the trailing data is ignored.

fileobj : file (optional)

The file-like object that this HDU was read from.

offset : int (optional)

If fileobj is specified, the offset into the file-like object at which this HDU begins.

checksum : bool (optional)

Check the HDU’s checksum and/or datasum.

ignore_missing_end : bool (optional)

Ignore a missing end card in the header data. Note that without the end card the end of the header can’t be found, so the entire data is just assumed to be the header.

kwargs : (optional)

May contain additional keyword arguments specific to an HDU type. Any unrecognized kwargs are simply ignored.

get_coldefs(*args, **kwargs)

Deprecated since version 3.0: Use the columns attribute instead.

Returns the table’s column definitions.

classmethod load(datafile, cdfile=None, hfile=None, replace=False, header=None)

Create a table from the input ASCII files. The input is from up to three separate files, one containing column definitions, one containing header parameters, and one containing column data.

The column definition and header parameters files are not required. When absent, the column definitions and/or header parameters are taken from the Header object given in the header argument; if that too is absent, sensible defaults are inferred (though this mode is not recommended).

Parameters :

datafile : file path, file object or file-like object

Input data file containing the table data in ASCII format.

cdfile : file path, file object, file-like object, optional

Input column definition file containing the names, formats, display formats, physical units, multidimensional array dimensions, undefined values, scale factors, and offsets associated with the columns in the table. If None, the column definitions are taken from the current values in this object.

hfile : file path, file object, file-like object, optional

Input parameter definition file containing the header parameter definitions to be associated with the table. If None, the header parameter definitions are taken from the current values in this objects header.

replace : bool

When True, indicates that the entire header should be replaced with the contents of the ASCII file instead of just updating the current header.

header : Header object

When the cdfile and hfile are missing, use this Header object in the creation of the new table and HDU. Otherwise this Header supersedes the keywords from hfile, which is only used to update values not present in this Header, unless replace=True, in which case this Header’s values are completely replaced with the values from hfile.

Notes

The primary use for the load method is to allow the input of table data and parameters (in ASCII form) that were edited in a standard text editor. The dump method can be used to create the initial ASCII files.

  • datafile: Each line of the data file represents one row of table data. The data is output one column at a time in column order. If a column contains an array, each element of the column array in the current row is output before moving on to the next column. Each row ends with a new line.

    Integer data is output right-justified in a 21-character field followed by a blank. Floating point data is output right-justified using ‘g’ format in a 21-character field with 15 digits of precision, followed by a blank. String data that does not contain whitespace is output left-justified in a field whose width matches the width specified in the TFORM header parameter for the column, followed by a blank. When the string data contains whitespace characters, the string is enclosed in quotation marks (""). For the last data element in a row, the trailing blank in the field is replaced by a newline character.

    For column data containing variable length arrays (‘P’ format), the array data is preceded by the string 'VLA_Length= ' and the integer length of the array for that row, left-justified in a 21-character field, followed by a blank.

    For column data representing a bit field (‘X’ format), each bit value in the field is output right-justified in a 21-character field as 1 (for true) or 0 (for false).

  • cdfile: Each line of the column definitions file provides the definitions for one column in the table. The line is broken up into eight 16-character fields. The first field provides the column name (TTYPEn). The second field provides the column format (TFORMn). The third field provides the display format (TDISPn). The fourth field provides the physical units (TUNITn). The fifth field provides the dimensions for a multidimensional array (TDIMn). The sixth field provides the value that signifies an undefined value (TNULLn). The seventh field provides the scale factor (TSCALn). The eighth field provides the offset value (TZEROn). A field value of "" is used to represent the case where no value is provided.

  • hfile: Each line of the header parameters file provides the definition of a single HDU header card as represented by the card image.

classmethod readfrom(fileobj, checksum=False, ignore_missing_end=False, **kwargs)

Read the HDU from a file. Normally an HDU should be opened with fitsopen() which reads the entire HDU list in a FITS file. But this method is still provided for symmetry with writeto().

Parameters :

fileobj : file object or file-like object

Input FITS file. The file’s seek pointer is assumed to be at the beginning of the HDU.

checksum : bool

If True, verifies that both DATASUM and CHECKSUM card values (when present in the HDU header) match the header and data of all HDUs in the file.

ignore_missing_end : bool

Do not issue an exception when opening a file that is missing an END card in the last header.

req_cards(keyword, pos, test, fix_value, option, errlist)

Check the existence, location, and value of a required Card.

TODO: Write about parameters

If pos is None, the card can be anywhere in the header. If the card does not exist, the new card will have fix_value as its value when created. The card’s value is also checked using the test argument.

run_option(option='warn', err_text='', fix_text='Fixed.', fix=None, fixable=True)

Execute the verification with selected option.

size

Size (in bytes) of the data portion of the HDU.

classmethod tcreate(*args, **kwargs)

Deprecated since version 3.1: Use load() instead.

tdump(*args, **kwargs)

Deprecated since version 3.1: Use dump() instead.

update()

Update header keywords to reflect recent changes of columns.

update_ext_name(value, comment=None, before=None, after=None, savecomment=False)

Update the extension name associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If no before or after is specified, it will be appended at the end.

Parameters :

value : str

value to be used for the new extension name

comment : str, optional

comment to be used for updating; default None

before : str or int, optional

name of the keyword, or index of the Card, before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

name of the keyword, or index of the Card, after which the new card will be placed in the Header.

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified then the current comment will automatically be preserved.

update_ext_version(value, comment=None, before=None, after=None, savecomment=False)

Update the extension version associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If no before or after is specified, it will be appended at the end.

Parameters :

value : str

value to be used for the new extension version

comment : str, optional

comment to be used for updating; default None

before : str or int, optional

name of the keyword, or index of the Card, before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

name of the keyword, or index of the Card, after which the new card will be placed in the Header.

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified then the current comment will automatically be preserved.

verify(option='warn')

Verify all values in the instance.

Parameters :

option : str

Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". See Verification options for more info.

verify_checksum(blocking='standard')

Verify that the value in the CHECKSUM keyword matches the value calculated for the current HDU CHECKSUM.

Parameters :

blocking : str, optional

“standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes at a time

Returns :

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no CHECKSUM keyword present
verify_datasum(blocking='standard')

Verify that the value in the DATASUM keyword matches the value calculated for the DATASUM of the current HDU data.

Parameters :

blocking : str, optional

“standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes at a time

Returns :

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no DATASUM keyword present
writeto(name, output_verify='exception', clobber=False, checksum=False)

Works similarly to the normal writeto(), but prepends a default PrimaryHDU, as required by extension HDUs (which cannot stand on their own).

TableHDU

class astropy.io.fits.TableHDU(data=None, header=None, name=None)

Bases: astropy.io.fits.hdu.table._TableBaseHDU

FITS ASCII table extension HDU class.

add_checksum(when=None, override_datasum=False, blocking='standard')

Add the CHECKSUM and DATASUM cards to this HDU with the values set to the checksum calculated for the HDU and the data respectively. The addition of the DATASUM card may be overridden.

Parameters :

when : str, optional

comment string for the cards; by default the comments will represent the time when the checksum was calculated

override_datasum : bool, optional

add the CHECKSUM card only

blocking : str, optional

“standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes at a time

Notes

For testing purposes, first call add_datasum with a when argument, then call add_checksum with a when argument and override_datasum set to True. This will provide consistent comments for both cards and enable the generation of a CHECKSUM card with a consistent value.

add_datasum(when=None, blocking='standard')

Add the DATASUM card to this HDU with the value set to the checksum calculated for the data.

Parameters :

when : str, optional

Comment string for the card that by default represents the time when the checksum was calculated

blocking : str, optional

“standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes at a time

Returns :

checksum : int

The calculated datasum

Notes

For testing purposes, provide a when argument to enable the comment value in the card to remain consistent. This will enable the generation of a CHECKSUM card with a consistent value.

copy()

Make a copy of the table HDU; both header and data are copied.

filebytes()

Calculates and returns the number of bytes that this HDU will write to a file.

Parameters :

None

Returns :

The number of bytes.
fileinfo()

Returns a dictionary detailing information about the locations of this HDU within any associated file. The values are only valid after a read or write of the associated file with no intervening changes to the HDUList.

Parameters :

None

Returns :

dictionary or None

The dictionary details information about the locations of this HDU within an associated file. Returns None when the HDU is not associated with a file.

Dictionary contents:

Key       Value
file      File object associated with the HDU
filemode  Mode in which the file was opened (readonly, copyonwrite, update, append, ostream)
hdrLoc    Starting byte location of header in file
datLoc    Starting byte location of data block in file
datSpan   Data size including padding

classmethod fromstring(data, fileobj=None, offset=0, checksum=False, ignore_missing_end=False, **kwargs)

Creates a new HDU object of the appropriate type from a string containing the HDU’s entire header and, optionally, its data.

Note: When creating a new HDU from a string without a backing file object, the data of that HDU may be read-only. It depends on whether the underlying string was an immutable Python str/bytes object, or some kind of read-write memory buffer such as a memoryview.

Parameters :

data : str, bytearray, memoryview, ndarray

A byte string containing the HDU’s header and, optionally, its data. If fileobj is not specified, and the length of data extends beyond the header, then the trailing data is taken to be the HDU’s data. If fileobj is specified then the trailing data is ignored.

fileobj : file (optional)

The file-like object that this HDU was read from.

offset : int (optional)

If fileobj is specified, the offset into the file-like object at which this HDU begins.

checksum : bool (optional)

Check the HDU’s checksum and/or datasum.

ignore_missing_end : bool (optional)

Ignore a missing end card in the header data. Note that without the end card the end of the header can’t be found, so the entire data is just assumed to be the header.

kwargs : (optional)

May contain additional keyword arguments specific to an HDU type. Any unrecognized kwargs are simply ignored.

get_coldefs(*args, **kwargs)

Deprecated since version 3.0: Use the columns attribute instead.

Returns the table’s column definitions.

classmethod readfrom(fileobj, checksum=False, ignore_missing_end=False, **kwargs)

Read the HDU from a file. Normally an HDU should be opened with fitsopen() which reads the entire HDU list in a FITS file. But this method is still provided for symmetry with writeto().

Parameters :

fileobj : file object or file-like object

Input FITS file. The file’s seek pointer is assumed to be at the beginning of the HDU.

checksum : bool

If True, verifies that both DATASUM and CHECKSUM card values (when present in the HDU header) match the header and data of all HDUs in the file.

ignore_missing_end : bool

Do not issue an exception when opening a file that is missing an END card in the last header.

req_cards(keyword, pos, test, fix_value, option, errlist)

Check the existence, location, and value of a required Card.

TODO: Write about parameters

If pos is None, the card can be anywhere in the header. If the card does not exist, the new card will have fix_value as its value when created. The card’s value is also checked using the test argument.

run_option(option='warn', err_text='', fix_text='Fixed.', fix=None, fixable=True)

Execute the verification with selected option.

size

Size (in bytes) of the data portion of the HDU.

update()

Update header keywords to reflect recent changes of columns.

update_ext_name(value, comment=None, before=None, after=None, savecomment=False)

Update the extension name associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If no before or after is specified, it will be appended at the end.

Parameters :

value : str

value to be used for the new extension name

comment : str, optional

comment to be used for updating; default None

before : str or int, optional

name of the keyword, or index of the Card, before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

name of the keyword, or index of the Card, after which the new card will be placed in the Header.

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified then the current comment will automatically be preserved.

update_ext_version(value, comment=None, before=None, after=None, savecomment=False)

Update the extension version associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If no before or after is specified, it will be appended at the end.

Parameters :

value : str

value to be used for the new extension version

comment : str, optional

comment to be used for updating; default None

before : str or int, optional

name of the keyword, or index of the Card, before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

name of the keyword, or index of the Card, after which the new card will be placed in the Header.

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified then the current comment will automatically be preserved.

verify(option='warn')

Verify all values in the instance.

Parameters :

option : str

Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". See Verification options for more info.

verify_checksum(blocking='standard')

Verify that the value in the CHECKSUM keyword matches the value calculated for the current HDU CHECKSUM.

Parameters :

blocking : str, optional

“standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes at a time

Returns :

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no CHECKSUM keyword present
verify_datasum(blocking='standard')

Verify that the value in the DATASUM keyword matches the value calculated for the DATASUM of the current HDU data.

Parameters :

blocking : str, optional

“standard” or “nonstandard”; when “standard”, the checksum is computed 2880 bytes at a time

Returns :

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no DATASUM keyword present
writeto(name, output_verify='exception', clobber=False, checksum=False)

Works similarly to the normal writeto(), but prepends a default PrimaryHDU, as required by extension HDUs (which cannot stand on their own).

Column

class astropy.io.fits.Column(name=None, format=None, unit=None, null=None, bscale=None, bzero=None, disp=None, start=None, dim=None, array=None)

Bases: object

Class which contains the definition of one column (e.g. ttype, tform) and the array containing the values for the column. Does not support theap yet.

copy()

Return a copy of this Column.
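A sketch of defining columns (the format strings are FITS TFORM codes):

```python
import numpy as np
from astropy.io import fits

# E = single-precision float, 10A = 10-character string (TFORM codes).
c1 = fits.Column(name='mag', format='E', unit='mag',
                 array=np.array([12.3, 15.8], dtype='f4'))
c2 = fits.Column(name='target', format='10A',
                 array=np.array(['NGC1275', 'M87']))

c3 = c1.copy()  # an independent copy of the definition
```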

ColDefs

class astropy.io.fits.ColDefs(input, tbtype='BinTableHDU')

Bases: object

Column definitions class.

It has attributes corresponding to the Column attributes (e.g. ColDefs has the attribute names while Column has name). Each attribute in ColDefs is a list of corresponding attribute values from all Column objects.

add_col(column)

Append one Column to the column definition.

change_attrib(col_name, attrib, new_value)

Change an attribute (in the commonName list) of a Column.

Parameters :

col_name : str or int
The column name or index to change
attrib : str
The attribute name
new_value : object
The new value for the attribute
change_name(col_name, new_name)

Change a Column’s name.

Parameters :

col_name : str
The current name of the column
new_name : str
The new name of the column
change_unit(col_name, new_unit)

Change a Column’s unit.

Parameters :

col_name : str or int
The column name or index
new_unit : str
The new unit for the column
data

Deprecated since version 3.0: The data attribute is deprecated; use the ColDefs.columns attribute instead.

What was originally self.columns is now self.data; this provides some backwards compatibility.

del_col(col_name)

Delete (the definition of) one Column.

Parameters :

col_name : str or int
The column’s name or index
info(attrib='all', output=None)

Get attribute(s) information of the column definition.

Parameters :

attrib : str

Can be one or more of the attributes listed in KEYWORD_ATTRIBUTES. The default is "all" which will print out all attributes. It forgives plurals and blanks. If there are two or more attribute names, they must be separated by comma(s).

output : file, optional

File-like object to output to. Outputs to stdout by default. If False, returns the attributes as a dict instead.

Notes

This function doesn’t return anything by default; it just prints to stdout.
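A sketch of the plural attributes and the change_* helpers:

```python
import numpy as np
from astropy.io import fits

c1 = fits.Column(name='ra', format='D', array=np.array([10.5]))
c2 = fits.Column(name='dec', format='D', array=np.array([-3.2]))
cols = fits.ColDefs([c1, c2])

# Plural attributes gather the corresponding value from every Column.
names_before = list(cols.names)

cols.change_name('ra', 'alpha')
cols.change_unit('dec', 'deg')
```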

FITS_rec

class astropy.io.fits.FITS_rec

Bases: numpy.core.records.recarray

FITS record array class.

FITS_rec is the data part of a table HDU. This is a layer over the recarray, so we can deal with scaled columns.

It inherits all of the standard methods from numpy.ndarray.

columns

A user-visible accessor for the coldefs. See ticket #44.

field(key)

A view of a Column’s data as an array.

FITS_record

class astropy.io.fits.FITS_record(input, row=0, start=None, end=None, step=None, base=None, **kwargs)

Bases: object

FITS record class.

FITS_record is used to access records of the FITS_rec object. This will allow us to deal with scaled columns. It also handles conversion/scaling of columns in ASCII tables. The FITS_record class expects a FITS_rec object as input.

field(field)

Get the field data of the record.

setfield(field, value)

Set the field data of the record.

Table Functions

new_table()

astropy.io.fits.new_table(input, header=None, nrows=0, fill=False, tbtype='BinTableHDU')

Create a new table from the input column definitions.

Warning: Creating a new table using this method creates an in-memory copy of all the column arrays in the input. This is because if they are separate arrays they must be combined into a single contiguous array.

If the column data is already in a single contiguous array (such as an existing record array) it may be better to create a BinTableHDU instance directly. See the PyFITS documentation for more details.

Parameters :

input : sequence of Column or ColDefs objects

The data to create a table from.

header : Header instance

Header to be used to populate the non-required keywords.

nrows : int

Number of rows in the new table.

fill : bool

If True, will fill all cells with zeros or blanks. If False, copy the data from input; undefined cells will still be filled with zeros/blanks.

tbtype : str

Table type to be created (“BinTableHDU” or “TableHDU”).

tabledump()

astropy.io.fits.tabledump(filename, datafile=None, cdfile=None, hfile=None, ext=1, clobber=False)

Dump a table HDU to a file in ASCII format. The table may be dumped in three separate files, one containing column definitions, one containing header parameters, and one for table data.

Parameters :

filename : file path, file object or file-like object

Input FITS file.

datafile : file path, file object or file-like object (optional)

Output data file. The default is the root name of the input fits file appended with an underscore, followed by the extension number (ext), followed by the extension .txt.

cdfile : file path, file object or file-like object (optional)

Output column definitions file. The default is None, no column definitions output is produced.

hfile : file path, file object or file-like object (optional)

Output header parameters file. The default is None, no header parameters output is produced.

ext : int

The number of the extension containing the table HDU to be dumped.

clobber : bool

Overwrite the output files if they exist.

Notes

The primary use for the tabledump function is to allow the table data and parameters to be edited in a standard text editor. The tableload function can be used to reassemble the table from the three ASCII files.

  • datafile: Each line of the data file represents one row of table data. The data is output one column at a time in column order. If a column contains an array, each element of the column array in the current row is output before moving on to the next column. Each row ends with a new line.

    Integer data is output right-justified in a 21-character field followed by a blank. Floating point data is output right-justified using ‘g’ format in a 21-character field with 15 digits of precision, followed by a blank. String data that does not contain whitespace is output left-justified in a field whose width matches the width specified in the TFORM header parameter for the column, followed by a blank. When the string data contains whitespace characters, the string is enclosed in quotation marks (""). For the last data element in a row, the trailing blank in the field is replaced by a newline character.

    For column data containing variable length arrays (‘P’ format), the array data is preceded by the string 'VLA_Length= ' and the integer length of the array for that row, left-justified in a 21-character field, followed by a blank.

    For column data representing a bit field (‘X’ format), each bit value in the field is output right-justified in a 21-character field as 1 (for true) or 0 (for false).

  • cdfile: Each line of the column definitions file provides the definitions for one column in the table. The line is broken up into eight 16-character fields. The first field provides the column name (TTYPEn). The second field provides the column format (TFORMn). The third field provides the display format (TDISPn). The fourth field provides the physical units (TUNITn). The fifth field provides the dimensions for a multidimensional array (TDIMn). The sixth field provides the value that signifies an undefined value (TNULLn). The seventh field provides the scale factor (TSCALn). The eighth field provides the offset value (TZEROn). A field value of "" is used to represent the case where no value is provided.

  • hfile: Each line of the header parameters file provides the definition of a single HDU header card as represented by the card image.
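The datafile row layout described above can be sketched in plain Python. This is an illustrative helper, not part of the astropy API; for brevity it omits the TFORM-width padding of string fields:

```python
def format_row(values):
    """Format one row of table data in the datafile layout described above.

    Integers (and 'X' bit values) are right-justified in 21-character
    fields, floats use 'g' format with 15 digits of precision, strings
    containing whitespace are quoted, and the final field's trailing
    blank is replaced by a newline.
    """
    fields = []
    for value in values:
        if isinstance(value, bool):            # 'X' bit fields: 1 or 0
            fields.append('{:>21d}'.format(int(value)))
        elif isinstance(value, int):
            fields.append('{:>21d}'.format(value))
        elif isinstance(value, float):
            fields.append('{:>21.15g}'.format(value))
        elif ' ' in str(value):                # strings with whitespace
            fields.append('"{}"'.format(value))
        else:
            fields.append(str(value))
    return ' '.join(fields) + '\n'

print(repr(format_row([7, 1.5, 'NGC1234'])))
```

For example, the integer 7 occupies the rightmost position of a 21-character field, so the line begins with 20 spaces.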

tableload()

astropy.io.fits.tableload(datafile, cdfile, hfile=None)

Create a table from the input ASCII files. The input is from up to three separate files, one containing column definitions, one containing header parameters, and one containing column data. The header parameters file is not required. When the header parameters file is absent a minimal header is constructed.

Parameters :

datafile : file path, file object or file-like object

Input data file containing the table data in ASCII format.

cdfile : file path, file object or file-like object

Input column definition file containing the names, formats, display formats, physical units, multidimensional array dimensions, undefined values, scale factors, and offsets associated with the columns in the table.

hfile : file path, file object or file-like object (optional)

Input parameter definition file containing the header parameter definitions to be associated with the table. If None, a minimal header is constructed.

Notes

The primary use for the tableload function is to allow the input of table data and parameters that were edited in a standard text editor. The tabledump function can be used to create the initial ASCII files.

  • datafile: Each line of the data file represents one row of table data. The data is output one column at a time in column order. If a column contains an array, each element of the column array in the current row is output before moving on to the next column. Each row ends with a new line.

    Integer data is output right-justified in a 21-character field followed by a blank. Floating point data is output right-justified using ‘g’ format in a 21-character field with 15 digits of precision, followed by a blank. String data that does not contain whitespace is output left-justified in a field whose width matches the width specified in the TFORM header parameter for the column, followed by a blank. When the string data contains whitespace characters, the string is enclosed in quotation marks (""). For the last data element in a row, the trailing blank in the field is replaced by a new line character.

    For column data containing variable length arrays (‘P’ format), the array data is preceded by the string 'VLA_Length= ' and the integer length of the array for that row, left-justified in a 21-character field, followed by a blank.

    For column data representing a bit field (‘X’ format), each bit value in the field is output right-justified in a 21-character field as 1 (for true) or 0 (for false).

  • cdfile: Each line of the column definitions file provides the definitions for one column in the table. The line is broken up into eight 16-character fields. The first field provides the column name (TTYPEn). The second field provides the column format (TFORMn). The third field provides the display format (TDISPn). The fourth field provides the physical units (TUNITn). The fifth field provides the dimensions for a multidimensional array (TDIMn). The sixth field provides the value that signifies an undefined value (TNULLn). The seventh field provides the scale factor (TSCALn). The eighth field provides the offset value (TZEROn). A field value of "" is used to represent the case where no value is provided.

  • hfile: Each line of the header parameters file provides the definition of a single HDU header card as represented by the card image.