From xemacs-m  Wed Feb  5 23:32:10 1997
Received: from pentagana.sonic.jp (root@tokyo-05-185.gol.com [202.243.51.185])
	by xemacs.org (8.8.5/8.8.5) with ESMTP id XAA13154
	for <xemacs-beta@xemacs.org>; Wed, 5 Feb 1997 23:32:08 -0600 (CST)
Received: from mother.sonic.jp (mother.sonic.jp [194.93.1.1]) by pentagana.sonic.jp (8.7.1+2.6Wbeta4/3.4W3) with ESMTP id OAA04001 for <xemacs-beta@xemacs.org>; Thu, 6 Feb 1997 14:29:57 +0900
Received: from pentagana (jhod@[194.93.1.69]) by mother.sonic.jp with SMTP (8.7.1/8.7.1) id OAA04608 for <xemacs-beta@xemacs.org>; Thu, 6 Feb 1997 14:30:14 +0900 (JST)
Sender: jhod@mother.sonic.jp
Message-ID: <32F96C53.273F69AD@po.iijnet.or.jp>
Date: Thu, 06 Feb 1997 14:29:55 +0900
From: Jareth Hein <jhod@po.iijnet.or.jp>
Organization: Sonic Software Planning, Tokyo
X-Mailer: Mozilla 3.01 (X11; I; Linux 2.0.28 i586)
MIME-Version: 1.0
To: XEmacs Beta Mailing List <xemacs-beta@xemacs.org>
Subject: Mule coding systems, SJIS BUG!!!
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit

There is a showstopper bug in 20.0 (and has been for a long while, as
far as I can tell...) involving the SJIS coding system. SJIS
(Micky$loth's unfortunately popular bastardization of JISX0201+JISX0208)
data can be read in, auto-detected, and properly decoded and displayed,
but when written out as SJIS the data is corrupted beyond recovery. The
following patch fixes an obvious flaw in the code, but even this does
not remove the problem.

*** src/mule-coding.c.old       Thu Feb  6 14:13:54 1997
--- src/mule-coding.c   Thu Feb  6 14:14:36 1997
***************
*** 2486,2492 ****
  
  #define ENCODE_SJIS(c1, c2, sj1, sj2)                 \
  do {                                                  \
!   int I1 = c1, I2 = sj2;                              \
    if (I1 & 1)                                         \
      sj1 = (I1 >> 1) + ((I1 < 0xdf) ? 0x31 : 0x71),    \
      sj2 = I2 - ((I2 >= 0xe0) ? 0x60 : 0x61);          \
--- 2486,2492 ----
  
  #define ENCODE_SJIS(c1, c2, sj1, sj2)                 \
  do {                                                  \
!   int I1 = c1, I2 = c2;                                       \
    if (I1 & 1)                                         \
      sj1 = (I1 >> 1) + ((I1 < 0xdf) ? 0x31 : 0x71),    \
      sj2 = I2 - ((I2 >= 0xe0) ? 0x60 : 0x61);          \

I'm not familiar enough yet with the conversion engine to figure out why
it is failing, as the rest of the code looks correct to me. Martin?

As a secondary issue, it's my rather deep-seated opinion that a buffer
should STAY in the coding system it was autodetected as. I don't like
the 'feature' in my current setup where it changes everything to
EUC or ISO8... Do I simply have something set up wrong? If so, I still
feel that conversions for any file-based buffers should not be
overridden without direct user intervention (exceptions being mail &
news, etc.).

--Jareth

