Re: nhc98-1.{00,01} produce crashing programs




Marcin 'Qrczak' Kowalczyk (qrczak@knm.org.pl)
3 Jan 2001 23:30:48 GMT


Mon, 1 Jan 2001 19:36:15 +0000, Malcolm Wallace <malcolm-nhc@cs.york.ac.uk> writes:

> It's a one-line change in the compiler, with a corresponding one-line
> change in GreenCard as well. Committed to CVS today.

It works now, thanks.

> Int/Word 8/16/64 are not really supported properly at all right now.

With QForeign they work well enough that the Curses interface works. QForeign provides Num/Integral/etc. instances, and Int64/Word64 are not used much.

I miss foreign export dynamic the most. Then the fact that case does not work for arbitrary numeric types. Then foreign label and foreign import dynamic.

addForeignPtrFinalizer is not emulated under nhc, because nhc does not provide addForeignFinalizer :: ForeignObj -> IO () -> IO () (this function is not so important).

QForeign comes with part-of-Parsec and GetOpt from ghc, which are needed by it but not distributed with nhc. It also emulates ghc's Bits module with the Bits class; nhc's Bit module is different. nhc does not provide the CTypes and CTypesISO modules, nor HsFFI.h - all are provided by QForeign. Perhaps some of these should be provided by nhc itself. (QForeign's CTypes and CTypesISO use type synonyms, where they would better be newtypes.)

8-bit Chars cause my Curses interface to use ASCII substitutes for semigraphics. In Unicode-enabled Haskell implementations (i.e. development versions of ghc) it represents semigraphics as appropriate Unicode characters and converts them to curses' chtype while displaying, to let the interface use plain String (these characters would have to be converted from some concrete representation anyway, because chtype values are available as stateful macros, so I just used Unicode Char values). But proper Unicode support requires some IO interface for handling charsets, which is not officially designed yet.

I use hmake to generate dependencies included into Makefiles. Files are spread over several directories.
hmake has -I and -P options, but neither is enough for the following way of compilation, which is used in QForeign. You may want to skip the next paragraph, which only explains why (I think) I need it. I'm afraid it would be too hard to use hmake to drive the whole compilation.

Some source files are generated by a preprocessor. The included .depend files themselves depend on automatically generated sources. These facts imply that make begins by ensuring that the Makefile and the files included into it are up to date, so it must run the preprocessor. It all happens automatically. But I sometimes want to just run everything needed to make the Makefiles up to date, because otherwise even make clean does not work: make tries to run the interpreter, which may not be compiled yet. This target is called boot. It enters each subdirectory, and in each of them it runs the preprocessor and generates dependencies.

This means that the files for which I generate dependencies import modules from other directories which are not compiled yet. If I pass these directories in -I options to hmake, it generates dependencies which say that the resulting object files for modules from other directories will be put in the current directory. OTOH, if I pass them in the -P option, hmake complains that it does not find .hi files.

What I want to say is that the sources of these modules are in different directories, and their *.o and *.hi files don't exist yet, but they will be in the directories where the sources reside. Is it possible to let hmake generate dependencies under that assumption? Or could hmake be changed to support such a scheme?

-- 
 __("<  Marcin Kowalczyk * qrczak@knm.org.pl http://qrczak.ids.net.pl/
 \__/
  ^^    SYGNATURA ZASTĘPCZA
QRCZAK
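[The remark above that QForeign's CTypes and CTypesISO would better use newtypes can be illustrated with a minimal, self-contained sketch; the names below are illustrative, not QForeign's actual definitions:]

```haskell
-- Sketch: why a newtype is preferable to a type synonym for a C type.
-- A synonym such as `type CInt = Int32` is fully interchangeable with
-- Int32, so accidentally mixing plain Haskell integers with C-typed
-- values is never caught; a newtype makes such mixing a type error
-- while keeping the same runtime representation.
import Data.Int (Int32)

type CIntSyn = Int32                 -- synonym: transparent, no extra safety

newtype CIntNew = CIntNew Int32      -- newtype: a distinct type, same representation
  deriving (Eq, Show)

-- With the synonym, any Int32 is accepted; nothing marks the FFI boundary.
fromSyn :: CIntSyn -> Int32
fromSyn = id

-- With the newtype, callers must wrap and unwrap explicitly, so plain
-- Int32 values cannot silently flow into C-typed positions.
fromNew :: CIntNew -> Int32
fromNew (CIntNew n) = n

main :: IO ()
main = do
  print (fromSyn 42)             -- an unwrapped Int32 is accepted here
  print (fromNew (CIntNew 42))   -- the newtype requires explicit wrapping
```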

